Meta hit with class-action lawsuit after Kenyan workers exposed to ‘intimate’ smart glasses footage
- Marijan Hassan - Tech Journalist
- 20 hours ago
- 3 min read
Meta and its manufacturing partner, EssilorLuxottica, were hit with a major class-action lawsuit on March 4, 2026, following explosive allegations that human contractors have been viewing highly sensitive and private recordings captured by Ray-Ban Meta smart glasses.

The lawsuit, filed in the U.S. District Court for the Northern District of California, accuses the tech giant of false advertising and "egregious" privacy violations, claiming the devices are marketed as "privacy-safe" while secretly transmitting intimate footage to overseas third-party reviewers.
The legal action stems from a joint investigation by Swedish newspapers Svenska Dagbladet and Göteborgs-Posten, which uncovered the harrowing experiences of data annotators in Nairobi, Kenya.
The ‘Sama’ investigation: What workers saw
The lawsuit relies heavily on testimony from more than 30 employees at Sama, a long-time Meta subcontractor. These workers were tasked with "labeling" footage to help Meta’s AI understand the world, but the content they reported seeing was far from mundane.
Workers reported being "distressed" after viewing footage of users showering, getting undressed, and engaging in sexual activity.
Annotators also described seeing high-resolution images of bank cards, PIN entries at ATMs, and sensitive tax documents that users presumably did not realize were being recorded or uploaded.
While Meta claims to use AI to automatically blur faces in training data, Kenyan workers testified that the technology is inconsistent. "Faces are often clearly recognizable," one worker told investigators, "especially in low-light home environments."
‘Designed for Privacy’: The false advertising claim
The core of the legal complaint focuses on Meta’s marketing slogan: "Designed for privacy, controlled by you."
Plaintiffs Gina Bartone and Mateo Canu argue that a reasonable consumer would never expect "controlled by you" to mean that footage from their bedroom or bathroom would be cataloged by human workers in Africa.
The always-ready trap
While the glasses only record when prompted (via tap or "Hey Meta"), the lawsuit alleges that the multimodal AI features, which require the camera to "look" at the world to answer questions, often capture background footage that users haven't explicitly saved. Worse still, the data is sent to Meta’s servers for "training."
In-store misinformation
Swedish journalists visited several eyewear retailers and found that sales staff frequently gave false information, telling customers that data stays local to the device and is never shared with Meta.
Regulatory fallout: The UK ICO steps in
The controversy has quickly crossed the Atlantic, drawing the attention of European and British regulators.
The UK’s Information Commissioner’s Office (ICO) confirmed this week that it has written to Meta demanding an "urgent clarification" on how it filters sensitive data and whether UK citizens' private footage has been exported to Kenya for review.
Legal experts warn that if Meta is found to be exporting sensitive biometric or "special category" data (such as footage of sexual activity) to countries without adequate data protection laws, it could face fines of up to 4% of its global annual turnover.
Meta’s defense: Standard practice
In a brief written statement, Meta defended its practices as essential for the development of advanced AI.
Meta points to its Terms of Service, which state that "interactions with AI" (including images and video shared for analysis) may be reviewed by humans to improve the system. A spokesperson stated that the company uses automated filters to block sensitive content from human view, though the lawsuit argues these filters are "categorically failing."
"Meta chose to make privacy the centerpiece of its marketing campaign while concealing the facts that reveal those promises to be false," the complaint stated. "This is a new frontier of surveillance where the home is no longer a sanctuary."