
Meta wants to put a name tag on you

A leaked memo. A new rollout from Ring. A settlement that took eleven years. None of these stories are about us — but they're all, in different ways, the same story.

Meta x Ray-Ban smart glasses on display. The "name tag" facial recognition feature would let wearers identify strangers on sight.

A few weeks ago, an internal Meta memo leaked to the New York Times. The plan: add facial recognition to its Ray-Ban and Oakley AI glasses so the wearer can look at a stranger and see who they are. Meta calls it a "name tag" feature.

Stop and picture that for a second. Someone walks past you on the street, glances over for a beat too long, and now they know your name. Maybe your job. Maybe whatever else they decide to look up next. You did nothing. You wore a face.

This is what we mean when we say facial recognition is not a technology you can opt into or out of in the moment. It happens to you. By the time you'd say "no thanks," the scan is done and the match is sitting on someone else's screen.

We started FacePrivacy because the legal frameworks people assume will protect them (GDPR, CCPA, state biometric laws) only work if somebody actually files the requests. The right to be forgotten isn't automatic. It's a form. It's nine forms, in different formats, in different jurisdictions, that have to be re-filed every time your face gets re-scraped. Most people don't do it. The databases count on that.

This wasn't an isolated week

The Meta news broke alongside a few other things you should know about:

Surveillance

Amazon Ring's "Familiar Faces" rollout

Senator Markey's office disclosed last week that Ring's new facial recognition feature scans every face in front of every Ring doorbell. Device owners get privacy controls. Delivery drivers, kids walking past, and neighbors don't.

Settlement

The OkCupid / Clarifai settlement

The FTC just confirmed Match Group shared roughly three million OkCupid users' photos with the facial recognition firm Clarifai back in 2014 without consent. Clarifai used them as training data. Eleven years later, Clarifai is finally agreeing to delete the models.

Wrongful Arrest

Kimberlee Williams

Wrongfully arrested by Maryland police based on a bad facial recognition match. Six months in jail. The fourteenth case of this kind that's been made public.

None of these stories are about FacePrivacy. We don't appear in any of them. But they're all, in different ways, the same story: your face is being used in systems you didn't consent to, by people who don't know you, with consequences you can't predict and can't easily reverse.

What we actually do (and don't)

We won't tell you we can stop Meta from shipping name tag glasses. We can't. We're not going to pretend a subscription to our service will keep your face out of every camera in every airport. It won't.

What we do is narrower than that, and more useful. We submit and re-submit removal requests on your behalf to the facial recognition databases that honor opt-outs:

Clearview AI, PimEyes, Precheck.ai, FaceCheck.id, Corsight, Social Catfish, ClarityCheck, EyeMatch, FindClone, …and others

We use the legal rights you already have under GDPR and CCPA. When your face reappears in those databases (which it does, because new scrapes keep happening), we file again. And for photos you upload going forward, we offer a way to subtly alter them so their facial embeddings shift, meaning future training runs and search engines don't get a clean match. The pictures still look like you. The math underneath stops cooperating.
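The embedding trick is easier to see with a toy model. The sketch below is not our pipeline: it stands in for a real face-recognition network with a fixed random linear projection, purely to show the mechanics of a cloaking perturbation. Every pixel changes by at most a small budget, yet the embedding a matcher would index drifts. All names and numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a face-embedding model: a fixed random linear map
# from pixel space to embedding space. Real recognizers are deep
# networks, but the cloaking mechanics are the same: nudge pixels
# within a small per-pixel budget so the embedding moves.
D_PIX, D_EMB = 1024, 128
W = rng.standard_normal((D_EMB, D_PIX)) / np.sqrt(D_PIX)

def embed(x: np.ndarray) -> np.ndarray:
    """Map a 'photo' to a unit-norm embedding vector."""
    e = W @ x
    return e / np.linalg.norm(e)

face = rng.random(D_PIX)     # stand-in "photo", pixel values in [0, 1]
original = embed(face)       # the embedding a scraper would index

# One signed-gradient step away from the original embedding. For a
# linear model this is the optimal move under a per-pixel cap; deep
# models need iterative versions of the same idea.
EPS = 0.05                   # max change per pixel: visually negligible
grad = W.T @ original        # direction that keeps similarity high
cloaked = np.clip(face - EPS * np.sign(grad), 0.0, 1.0)

sim = float(original @ embed(cloaked))   # cosine similarity after cloaking
print(f"max pixel change: {np.max(np.abs(cloaked - face)):.3f}")
print(f"cosine similarity, original vs. cloaked: {sim:.3f}")
```

The point of the demo is the asymmetry: the pixel change is capped at 0.05 per channel, invisible to a person, while the cosine similarity between the two embeddings measurably drops.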

That's it. No magic. Just paperwork done at scale, and one technical trick that buys you a little ground on new uploads.
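Mechanically, "paperwork done at scale" is a loop: file, wait for confirmation, watch for the face to reappear, file again. A minimal sketch of that loop, with entirely hypothetical names (our real tooling differs):

```python
from dataclasses import dataclass
from enum import Enum, auto

class Status(Enum):
    PENDING = auto()  # request filed, awaiting the database's response
    REMOVED = auto()  # database confirmed deletion

@dataclass
class RemovalCase:
    database: str     # e.g. "Clearview AI", "PimEyes"
    status: Status = Status.PENDING
    filings: int = 1  # a case starts with its first request filed

def rescan(case: RemovalCase, face_found: bool) -> RemovalCase:
    # Periodic check: a confirmed removal that resurfaces after a new
    # scrape reopens the case, and another request goes out.
    if case.status is Status.REMOVED and face_found:
        case.filings += 1
        case.status = Status.PENDING
    return case
```

Usage: mark a case removed, then a later rescan finds the face again; `filings` ticks up and the case cycles back to pending. The whole product is this loop, run across every database and every user, forever.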

Why we think this year matters

The arguments for waiting are getting weaker. Regulation is real, but slow. The EU AI Act has teeth, but enforcement scales with budgets and political will. State laws are a patchwork. Federal action in the US, on a timeline that would actually matter, isn't happening.

Meanwhile, every quarter the consumer rollout of facial recognition gets one notch more aggressive. Last year it was Faceboarding at Linate. This year it's Ring on every porch and Meta on every bridge of every nose at brunch. Whatever your mental model of "places where I get scanned" was eighteen months ago, it's already out of date.

If you're reading this on our site, you probably don't need the convincing part. You need the practical part.

So: before you do anything else, search yourself on PimEyes and Precheck.ai. See what's there. Then decide how much paperwork you want to do alone, and how much you want us to do for you.

Start Removal →

We'll be here either way.