The era of facial recognition AI has a dirty secret, and a major cleanup just happened. Clarifai, a high-profile AI platform, recently deleted three million photos that it received from the dating site OkCupid. This was not a voluntary act of kindness. It was the result of a long-running federal investigation into how these two companies handled your private data. Along with the pictures, Clarifai also wiped out any AI models it had built using that data. The move shows that the government is finally catching up to how tech companies have traded our faces like commodities for over a decade.
The story goes back to 2014. According to the Federal Trade Commission (FTC), Clarifai asked OkCupid executives for a “huge amount of awesome data” to train its systems. Some of those executives had actually invested money in Clarifai, creating a cozy relationship that left users in the dark. OkCupid handed over millions of user-uploaded photos, along with location data and demographic info. It did this despite its own privacy policy, which promised users that their data would stay safe. It was a massive betrayal of trust that remained hidden for years.
A Decade of Secret Surveillance
The public did not find out about this deal until a 2019 report from The New York Times. The investigation revealed that Clarifai used these dating profiles to build a tool that could guess a person’s age, sex, and race just by looking at their face. While the original data transfer happened more than a decade ago, the fallout is only hitting now. The FTC and Match Group, which owns OkCupid, settled the case last month. Match Group never admitted to misleading users, but the fact that Clarifai had to delete the photos tells the real story.
The FTC also accused Match Group and OkCupid of active deception, alleging that they hid what they were doing and tried to obstruct the government’s investigation. Because this was a first-time violation, the FTC could not actually fine the companies. Instead, it issued a permanent ban: OkCupid and Match are now legally prohibited from misrepresenting how they collect or share data. If they get caught violating that order, the consequences will be much more severe.
Why Your Data Safety Matters
This case is a wake-up call for anyone with an online profile. For years, AI startups have treated the internet like a free buffet of training data. They assumed that if you uploaded a photo, it was fair game for their algorithms. This mindset led to the creation of powerful surveillance tools built on the backs of people who were just trying to find a date or share a vacation photo. When companies trade your data behind your back, they aren’t just selling your info. They are selling your identity.
The deletion of these three million photos is a small win for privacy, but the battle is far from over. AI models are getting hungrier for data every day. As we move forward, we need to demand more transparency from the apps we use. If a company says they won’t share your data, they should mean it. The “Wild West” days of AI training are ending, and the “Great Dating Deception” will serve as a warning to every other tech firm trying to take a shortcut with our private lives.