Meta is being sued for privacy violations after an investigation led by a Swedish newspaper found that photos taken by the company’s “smart glasses” were reviewed by overseas employees.
According to TechCrunch, the investigation found that workers at a subcontractor based in Kenya regularly viewed images and film captured on Meta’s artificial intelligence-powered glasses. The reviewed footage includes potentially sensitive material, such as nudity, sexual activity, and even people using the toilet.
Meta says it blurs faces in images sent for review, but sources dispute that claim.
The lawsuit, TechCrunch notes, was filed by plaintiffs Gina Bartone, a resident of New Jersey, and Matteo Cano of California. They are represented by attorneys at the Clarkson Law Firm and allege Meta has violated privacy laws and engaged in false advertising by misrepresenting its privacy practices.
For example, Meta AI smart glasses are commonly marketed using phrases like “designed for privacy, under your control” and “designed for your privacy.” Most users, therefore, would not expect intimate footage to be viewed or shown to anyone.
“We see everything from living rooms to naked bodies. Meta’s database has that kind of material,” an activist based in Kenya told Swedish newspapers Svenska Dagbladet and Göteborgs-Posten. “Maybe someone was walking around with glasses, or wearing them, and then that person’s partner was in the bathroom, or they came out completely naked.”
“People can record themselves incorrectly and not even know what they’re recording. They’re real people like you and me.”
Another employee told the paper that, while reviewing such footage may seem like a blatant invasion of privacy, they do it because it is their job.
“You shouldn’t question it,” said the worker. “If you start asking questions, you’re gone.”
The subcontractors also noted that they review other data collected by the smart glasses, including audio samples that are used for processing and training purposes.
“It can be about any topic,” said one employee. “We see chats where someone talks about crimes or protests. It’s not just congratulations, it can be very dark things.”
The lawsuit argues that “no reasonable” consumer would buy Meta’s smart glasses if they knew the truth about the company’s practices.
“No reasonable consumer would interpret promises like ‘designed for privacy, controlled by you’ and ‘built for your privacy’ to mean that deeply personal footage from inside their homes would be viewed and cataloged by human workers overseas,” the lawsuit alleges. “Meta instead chose to make privacy the centerpiece of its extensive marketing campaign, hiding the facts that reveal the falsity of these promises.”
Meta has since confirmed that it uses overseas contractors to review the collected material, but has not commented on allegations that its privacy protections — such as blurring faces — fall far short of what could or should be expected.
“When people share content with Meta AI, we sometimes use contractors to review that data for the purpose of improving people’s experience, as do many other companies,” a spokesperson said. “We take steps to filter this data to protect people’s privacy and prevent the review of identifying information.”
Sources
California lawsuit accuses Meta of sending nude videos from AI glasses to workers
Meta faces a class-action lawsuit over smart glasses privacy claims
Meta sued over AI smart glasses privacy after workers reviewed nudity, sex, and other footage
‘We see everything’: Report says human contractors reviewing footage from Meta’s AI smart glasses saw more than users bargained for, prompting a new lawsuit against the company
