22 comments

  • gehwartzen 18 hours ago
    At this point I get about 1-2 emails a year telling me some company has exposed my private data in some way. It’s completely routine.

    We need a law mandating that the company pays at least $1k per exposed record per customer, or absolutely nothing will change. The current cost of “here’s a year’s worth of credit monitoring” doesn’t even amount to a slap on the wrist.

    • rolandog 18 hours ago
      And tied to inflation (or to a % of gross income), too; otherwise in X years it'll be cheaper to get fined than to hire information security officers.
    • overfeed 12 hours ago
      > We need a law mandating the company pays at least $1k per exposed record per customer or absolutely nothing will change.

      That won't change a single thing, except for shell-company shenanigans, more frequent bankruptcy proceedings, and the same people coming back trading under a new name and logo. A law sending people to prison may actually change things.

      • troad 7 hours ago
        "Oh you want to make a little start up to share recipes between friends or whatever? Aww, that's cute. Well, here's the OAuth spec and an incomplete list of footguns. I hope your grasp of elliptic curves is strong. Prison time if you fail."

        The only consequence of laws that criminalise mistakes in handling PII is to force everyone to externalise auth to the likes of Auth0. And you can bet your ass that if this ever happens, the likes of Auth0 will lobby like hell to make sure those laws are never repealed or updated, being a vast corrupt funnel of business to them.

        Congrats, you've created a new Intuit.

      • JeremyStinson 11 hours ago
        All those people have high-priced lawyers that will keep them out of prison. The DBA and the Data Engineer will be the ones who go to jail for "Not ensuring all applicable data security controls were configured, and enabled, to prevent the detection, collection, and modification of any and all data assets within the purview of Company X, all its holdings and subsidiaries."
        • lucyjojo 8 hours ago
          force nationalization of the business for egregious cases.
  • cataflam 23 hours ago
    Almost a month old, original source: https://cybernews.com/security/global-data-leak-exposes-bill...

    and I've never seen any confirmation elsewhere

    Looks like CyberNews has edited the article since I first saw it; it used to look quite suspicious and untrustworthy, but it now has more detail. It still doesn't say exactly what a record is, or how many uniques there are.

    • frereubu 21 hours ago
      I presume the database exists, but some of the details don't add up. IDMerit say "IDMERIT’s systems and security infrastructure have never been compromised", "there has never been a data breach or exfiltration from [our partners'] systems during, before, or after this event" and "IDMerit does not own, control or store customer data". But Cybernews says that they "promptly secured the database" after being notified. Cybernews also didn't give the reason why they thought this was to do with IDMerit (unless I missed it). I can't quite make head nor tail of it.
    • 0xbadcafebee 18 hours ago
      To sum up the updates in the article:

        - IDMerit asked the security researcher for proof, the researcher asked for money first, so IDMerit balked
        - IDMerit basically says they have no proof they were hacked, so they weren't
        - The researcher is a freelancer... for CyberNews...
      
      Even if somebody followed up with IDMerit, they would likely say they are not affected. The security researcher is probably the only person who could prove whether or not they were vulnerable at this point. If they don't come forward, we can only assume they weren't vulnerable, but we don't know. This is a good lesson for responsible disclosure in the future.

      ...also, this is yet another example of why we need a regulated Software Building Code, with penalties for not conforming to it. If somebody is found to be hosting a public Mongo instance with no authentication, it should be reported to a state or federal agency, so that real penalties can be applied, the way they are for other code violations. And they shouldn't have been allowed to launch with that in the first place. It shouldn't be up to random "security researchers" to police businesses.

    • tootie 22 hours ago
      It's a weird article. For one, the researcher says "they believe" the data belongs to IDMerit but apparently isn't sure. IDMerit denies that it owns the data, or that any of its partners do. And there are very few details about where or how they found this database. It's possibly some kind of hoax or ransom attempt? Or there really are just billions of unaccounted-for databases of private data sitting all over the Internet.
      • uean 21 hours ago
        The cybernews article does have some screenshots showing names like “idmb2c” … also that IDMerit was contacted in November and the ports were closed a day later.
  • neya 22 hours ago
    If I were in Vegas, I would bet my life savings that the data of the said ID verification company's CXOs isn't included in the leak. This is just like that McDonald's CEO video - they never use what they create.
    • submerge 19 hours ago
      I bet their data is included too, for two reasons:

      First, identity verification data for KYC is a little bit different from fast food or social media in that it's very difficult to live a normal life without being subject to any KYC checks. (I'm sure someone will chime in that they get paid in bitcoin and buy their groceries with cash.) If you are applying for some financial product or service that requires KYC, and they can't find any information about you, you will often either be denied that product or have to jump through a bunch of additional hoops to prove who you are. So it benefits CXOs to have their data included in these datasets, in fact if they are well paid they may well have more activity requiring KYC checks than the average person.

      Second, and much more simply, one's own data often makes for a good test case since you know its accuracy.

      • neya 5 hours ago
        I'm not disputing that they need KYC, I'm simply saying they probably use a more secure alternative than their own.
    • ezst 21 hours ago
      Or the tech executives barring their children from using social media.
      • neya 5 hours ago
        Absolutely!
  • egorfine 23 hours ago
    KYC = Kill Your Customer.
  • whatsupdog 22 hours ago
    Where the F does IDMerit even get all this data from? They have names, DOBs, addresses, phone numbers, national identity numbers for over a billion people? How?
    • wongarsu 22 hours ago
      The 1B number would contain multiple records per person.

      For example, if I (as a German in Germany, ymmv) open a bank account online, that involves a call with one of these companies where they take pictures and information from my passport and check that it's me. Then I choose payment in installments at some online shop: same game. Apply for a small loan? Same game. Set up an account for trading (stock exchange or crypto)? You guessed it, another call. Another payment in installments, backed by the same bank? Apparently verifying my identity again is easier than checking their database. Each of those is another record, potentially with a new identity document, address or even name (maybe you got married), but mostly just the same data confirmed again with another timestamp.

      Not all of them use the same identity verification service, but there aren't that many. And I wouldn't be surprised to learn that many are the same company under different brands.
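      The records-versus-unique-people distinction described above can be sketched in a few lines of Python. The field names (national_id, source) and values are purely hypothetical, for illustration only:

```python
# Hypothetical KYC verification log: several checks generated by one person.
# Field names and values are illustrative, not taken from the actual leak.
records = [
    {"national_id": "DE-123", "source": "bank_onboarding"},
    {"national_id": "DE-123", "source": "installment_payment"},
    {"national_id": "DE-123", "source": "crypto_exchange"},
    {"national_id": "FR-456", "source": "bank_onboarding"},
]

total_records = len(records)
# Deduplicate by the stable identifier to count actual people.
unique_people = len({r["national_id"] for r in records})

print(total_records, unique_people)  # 4 records, but only 2 people
```

      So a "1 billion records" headline says little about how many individuals are affected until the dump is deduplicated by some stable identifier.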

    • shakna 22 hours ago
      A record is not necessarily unique. Name changes, address changes, phone number changes, can all create "new" records in dumps like these.
    • uean 22 hours ago
      Makes sense if the ID verification process involves scanning a driver license or passport.

      Edit- rereading this, you’re obviously talking about scale. The original article is much better : https://cybernews.com/security/global-data-leak-exposes-bill...

  • gregbot 18 hours ago
    This made me absolutely livid:

    > We requested a security incident report from the ethical hackers as proof

    So instead of paying him a fair bug bounty, they demand that he write a formal report for them and prove to them that there is even a problem.

    Totally unhinged, but it gets worse:

    > the response was a demand for money for the report, which confirmed our suspicion that this was a ransom-related incident.

    Wow. So when the security researcher informs them that he would be happy to do some consulting work for them and tells them his rates, they flip out and accuse his initial good-samaritan decision to inform the company of the issue of being part of a plot to hold the company for ransom?

    Whoever thought this up is both totally delusional and a complete jerk. Truly, no good deed goes unpunished.

  • chikinpotpi 20 hours ago
    Nobody told their marketing department:

    https://www.idmerit.com/blog/idmerits-data-breach-fail-safe-...

    archived for posterity: https://archive.ph/MdSfO

  • danlitt 17 hours ago
    > We own and operate our proprietary platform, but we do not own, control or store customer data or the underlying data maintained by independent data sources.

    This seems like a critical sentence. Is this database actually operated by IDMerit, or by someone else? If the latter, who?

  • ericwebb 16 hours ago
    Remember when you'd get a letter in the mail: "your identity has been compromised, here is a subscription to an identity monitoring service."

    The system is broken. We shouldn't be left this vulnerable by our foundational infrastructure.

  • rmnclmnt 20 hours ago
    Unrelated to the story but TIL AOL is still a thing in 2026!
    • xbar 19 hours ago
      Seems like it deserves to be its own post.
  • pirate787 21 hours ago
    While this leak may or may not have happened, for this type of exposure there should be criminal liability for developers and executives. Criminal negligence and prison time.
    • outime 21 hours ago
      If developers are going to face criminal liability, they should IMHO also have legal ways to push back against certain implementations without risking their jobs, or at least have a way to leave a legal justification somewhere: "I'm doing this because I'm forced to but I disagree" which is then signed by management.

      Until then, you're putting the weight of the law on the wrong side of the equation, since developers aren't the ones consciously making risky decisions.

      • danlitt 17 hours ago
        Most countries already have whistleblower laws. If you are living somewhere that has any kind of "wrongful termination" legislation, an employer asking you to commit a crime is an open and shut case. I would guess that all of the USA and Europe would have existing sufficient protections, for example (although the US never ceases to surprise me).
      • activestore 16 hours ago
        They won’t do it just for protection.

        The state would need to offer a reward, and maybe witness relocation.

  • kevincloudsec 16 hours ago
    Every age verification mandate creates another one of these databases. A billion records, no password, plain text.
  • bilekas 22 hours ago
    > That review identified no exposure, vulnerability or unauthorized access within the IDMERIT environment

    The fact that they didn't vet their data providers has to be considered a form of negligence. In the end, it's on the company I'm handing my details over to to act responsibly, not their providers.

    I hate this delegation of responsibility; it's not a good look, and it will only get worse now that the entire internet will be ID-gated soon. But don't worry, every lapse in privacy and even security is in the name of 'saving the kids'.

  • djohnston 21 hours ago
    aol.com!?!?
  • jajuuka 18 hours ago
    Unprotected MongoDB, tables without password, data in plain text. It's a textbook example of doing absolutely everything wrong.
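    For context, MongoDB does not enforce authentication unless access control is explicitly enabled in its configuration. A minimal hardening sketch for mongod.conf (the kind of setting whose absence leads to exactly this failure mode; values illustrative) would be:

```yaml
# mongod.conf (excerpt): require authentication and avoid public exposure
security:
  authorization: enabled   # clients must authenticate before reading any data
net:
  bindIp: 127.0.0.1        # listen on localhost only; reach it via a private network, not the open internet
```

    With authorization left at its default and bindIp set to a public interface, anyone who can reach port 27017 can read the data, which matches the "unprotected, no password" description above.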
  • plagiarist 19 hours ago
    Yet another point of proof that the US needs a HIPAA covering PII.
  • stopbulying 10 hours ago
    Do such breaches make it trivial to lie to age and identity verification systems?
  • mbix77 23 hours ago
    What did measures like GDPR ever achieve, except making me click away a cookie prompt?
    • Rygian 23 hours ago
      Actual punitive measures taken against entities who e.g. manipulate personal data in a negligent way. [1]

      Which was much harder to achieve before.

      [1] https://www.enforcementtracker.com/

    • loloquwowndueo 23 hours ago
      Right to be forgotten - you can ask companies to delete data they hold on you.

      Data ownership/portability : you can ask companies for a copy of all data they hold on you or related to you.

      I’ve seen the latter used by job applicants to get an entire copy of their interviews, transcripts and assessments including the reason for not being hired.

    • saithir 20 hours ago
      It's really a wonder how every time GDPR is even remotely related, there's always got to be someone complaining that GDPR is at fault for the cookie/data prompts, and never that the sites and advertising companies (and their 2137 partners) are at fault for making those prompts as annoying as possible in the hope that you just agree.
    • etothepii 23 hours ago
      In the UK, open banking was essentially a response to GDPR. This has allowed (to a limited extent) a variety of tools to be built on top of bank accounts that otherwise would not have been.
    • akimbostrawman 19 hours ago
      It makes you aware that a site is selling your data or otherwise tracking you, because otherwise they would not need a banner requesting your consent to do so :)
    • throwaway270925 20 hours ago
      Since people still seem to conflate the two, let me say it loud and clear:

      GDPR HAS NOTHING TO DO WITH THE COOKIE PROMPTS!

    • pjc50 22 hours ago
      GDPR doesn't apply in the states, but hopefully it provides for some punishment for the poor security here for EU customers. Of course, then some Americans will get mad that a US company has to follow EU law.
      • bilekas 22 hours ago
        > Of course, then some Americans will get mad that a US company has to follow EU law.

        This is always the way of the world though, if you want to do business anywhere, you are of course obligated to follow the local laws and regulations. I don't see anyone disputing this outside of blatant patent infringement by certain countries.

      • ralferoo 22 hours ago
        The GDPR applies worldwide to any data held about EU or UK citizens, regardless of where they reside. It does apply in the US, it's just potentially harder for the EU to enforce meaningful penalties for infractions.
        • majorchord 15 hours ago
          > It does apply in the US

          EU law does not apply to US citizens residing in the US with no ties to the EU.

          • ralferoo 14 hours ago
            Correct. It does not apply to US citizens residing anywhere in the world. It does, however, as I said, apply to EU citizens regardless of where in the world they reside.

            If a company holds data about EU citizens, the GDPR applies to them, regardless of where that company is based. Including the US. Hence the statement "It (GDPR) does apply in the US" is completely correct.

            • quesera 9 hours ago
              It's written that way, perhaps.

              But there's no jurisdictional reality that any of country/union A's rights will protect a person while they are present in country/union B.

              In the same way that a US citizen does not have legal protection for free speech when present in, e.g. China, Saudi Arabia, or Germany.

              Even if the EU got the text incorporated into the UN Universal Declaration of Human Rights, there are famously many countries who are not signatories (and it would require a locally-implemented actual law to support its recognition).

              The EU can arrange post facto penalties for violations of their citizens' rights, to be (potentially) administered in the future, when a responsible entity enters EU jurisdiction, but absolutely not before then without cooperation by treaty with the nation where these foreign-and-not-real "rights" were violated. Which would be a surrender of sovereignty and basically unimaginable.

              (No comment on the goodness or successfulness of the GDPR here, just that no part of it is relevant outside of the EU regardless of how the text is composed.)

              (And this is all written with awareness that the US somehow manages to selectively enforce their laws extra-jurisdictionally in weak foreign nations. The EU is not the US, and the US is not weak.)

              • ralferoo 13 minutes ago
                Exactly; that is why I wrote "it's just potentially harder for the EU to enforce meaningful penalties for infractions."

                Your premise is true in one sense; however, the point remains: the GDPR covers all EU citizens, regardless of where the company is based. For small US companies, sure, the EU has very little power to enforce it, but larger companies that derive any revenue from the EU can be, and are, fined under the GDPR.

                There is more information here: https://www.gdpradvisor.co.uk/does-gdpr-affect-us-companies or here: https://www.clarip.com/data-privacy/gdpr-united-states/ or here: https://www.usitc.gov/publications/332/executive_briefings/g... or here: https://dataprivacymanager.net/5-biggest-gdpr-fines-so-far-2... (that last one, 16 of the 20 biggest fines were for companies outside the EU)

                I can't find the source, but Google's AI in the search results also claims that "EU GDPR fines for U.S. companies are significant, with U.S. firms facing roughly 83% of total GDPR fines, totaling over €4.68 billion by early 2025". That 83% figure seems unreasonably high to me, but it's possibly just a consequence of the size of the fine being based on worldwide revenue and over half of the 20 biggest fines were to Google and Meta.

  • esperent 22 hours ago
    This is actually a Fox News article and as far as I can see it's not corroborated anywhere.

    I saw a reddit thread about it earlier where someone said the apparent hacker refused to actually show any of the data and was asking for money. So it's probably just a scam rather than a real leak.

    • mapontosevenths 21 hours ago
      The Fox article just cites CyberNews.[0]

      Cybernews posts screenshots[1] featuring usernames like idmKYCCN and idmKYCFR, and the ports were locked down after contacting ID Merit.

      I think what's happened is that everyone is telling the literal truth and speaking very carefully, using that truth to obscure rather than inform. To hell with the victims. The way I interpret this is that their denials are both factually accurate AND misleading.

      The partner who said there is "no indication that any customer data has been compromised" is telling the literal truth. They can't find any indicators because they stink at logging, and the screenshots posted on CyberNews intentionally obscure the customer info; CyberNews only shows the IDM usernames in plaintext, which was the responsible thing to do. They literally can't see any indications... of customer data... because they don't have logs.

      It should also be noted that the partner's customer in this case is likely ID Merit... not the people whose information was stolen. So again, their statement was literally true even if they do find evidence of a billion records being leaked.

      Nobody should ever trust anyone involved in this again if I'm correct in this interpretation of the available facts.

      [0] https://www.foxnews.com/tech/1-billion-identity-records-expo...

      [1] https://cybernews.com/security/global-data-leak-exposes-bill...