Meta was watching you in your bathroom
Consent is being framed here as a technical fact: the user agreed. But consent requires understanding, and understanding requires an environment where you know what’s happening. That environment is structurally absent. The man who leaves his Meta glasses on the nightstand doesn’t know they’re still recording. His partner doesn’t know she’s being watched when she walks out of the bathroom. And an annotator in Nairobi knows exactly what he’s seeing, but has no choice except to keep clicking.
The structural picture points to a single mechanism: a contractor chain that enables data collection at scale while legal liability evaporates. “Designed for privacy” isn’t a description of a system. It’s a marketing claim that directly contradicts how the system actually works. At the same time, the need for human annotation reveals a technical reality that rarely gets said out loud: models don’t learn on their own. They learn from labeled human work, and that work requires people to watch what users unknowingly capture. Opt-out as the default, rather than opt-in, isn’t an oversight within that setup. It’s a deliberate choice that keeps the data pipeline running. And with facial recognition as the next step, the risk extends from the person wearing the glasses to everyone around them.
What ultimately disappears here is the ability to know when you’re private. Not through a single breach, but through an environment that makes the boundary structurally invisible. The question isn’t whether users should pay closer attention. The question is what kind of society we become when paying attention is no longer enough because the technology is always on.