Meta is facing a class action lawsuit in the US over privacy practices tied to its Ray-Ban Meta smart glasses. The allegation is straightforward and damning: private camera footage captured by the glasses was sent to a third-party subcontractor in Kenya for manual review as part of AI training.
Let me say that again. You're wearing glasses with a camera. You record something, maybe your kid, maybe your living room, maybe a private conversation. That footage gets sent to a person in Kenya who watches it, labels it, and feeds it into Meta's AI models. You didn't know this was happening.
This was always going to happen. We all knew it. And we collectively chose not to think about it.

The elephant in every pair of smart glasses
Camera-equipped smart glasses have had a privacy problem since the day Google Glass made everyone uncomfortable at restaurants in 2013. The social contract around cameras has always been based on visibility. You can see when someone pulls out a phone. You can see a security camera on the wall. A camera on someone's face that looks like regular glasses is fundamentally different. People around you don't know they're being recorded.
Meta's Ray-Ban glasses sold over 7 million units. That's 7 million cameras walking around in public, in homes, in workplaces, on faces that look like they're just wearing sunglasses. The recording indicator is a tiny LED that most people never notice.
The lawsuit doesn't allege that Meta is secretly recording people without the wearer's knowledge. It alleges that what happens to the footage after it's captured isn't what users expected. You might use the glasses to take a photo of a restaurant menu or record a quick video of your dog. You probably didn't read the terms of service closely enough to know that footage could be reviewed by human contractors in another country.

The AI training problem
This is the uncomfortable truth about every AI-powered device. AI models need training data. Training data needs labels. Labels need human reviewers. Those reviewers see your stuff.
Apple went through this with Siri recordings. Amazon went through it with Alexa. Google went through it with Assistant. Every time, the response was shock that real humans were listening to recordings, followed by apologies and policy changes, followed by everyone forgetting about it until the next incident.
Smart glasses make this problem worse because the data they capture is visual, not just audio. Audio snippets of someone asking about the weather are one thing. Video footage of someone's home, their family, their private spaces, captured from a first-person camera on someone's face, is categorically different.

Meta's position
Meta hasn't publicly commented on the specifics of the lawsuit at the time of writing. Their general privacy stance with Ray-Ban Meta has been that data collection is disclosed in their privacy policy and users can control what data is shared.
The problem with that defense is that nobody reads privacy policies. That's not a user failing. It's a design choice. When the privacy implications of a product are buried in a legal document that 99% of users will never read, the company is making a deliberate decision to prioritize adoption over informed consent.
Seven million users. How many of them understood that their footage could be manually reviewed by human contractors? I'd bet the number is close to zero.

The broader implications for smart glasses
This lawsuit isn't just about Meta. It's about the entire smart glasses industry. Samsung is launching AR glasses this year. Google has Android XR glasses coming. Snap is releasing consumer Spectacles. Apple is reportedly working on smart glasses for 2027. Every single one of these devices will likely have cameras.
The question every company in the smart glasses space needs to answer is: what happens to the data these cameras capture? Is it processed entirely on-device? Is it sent to the cloud? Can third parties access it? Is it used for AI training? Can users genuinely control and delete their data?
These aren't hypothetical questions anymore. There's a class action lawsuit in progress that says Meta's answers to these questions weren't good enough. Every other company building camera glasses should be paying very close attention.

What I think
I like smart glasses. I think they're the future of personal computing. The idea of having AI assistance available through something as lightweight as a pair of glasses is genuinely exciting. But that future only works if people trust the devices on their faces.
Trust isn't built by burying data practices in legal documents. It's built by being transparent about what data is collected, where it goes, and who sees it, and by giving users real control over all of that. Not a settings menu that defaults to maximum data collection. Real, visible, obvious control.
Meta has sold 7 million Ray-Ban smart glasses. That's an incredible achievement. But if the price of that success is a privacy reckoning that poisons public trust in the entire category, it won't matter how many units shipped. The entire smart glasses industry needs to get ahead of this before the backlash sets in.
The lawsuit is just the beginning.
