The most common follow-up question after we explain what FacePrivacy does: "Why doesn't Incogni / DeleteMe / Optery just add this?"
Fair question. The answer isn't laziness or capacity. The data broker industry and the facial recognition industry are different worlds — different vendors, different opt-out processes, different verification requirements, different laws, and different technical realities. Even a well-funded data broker service can't bolt face removal on without standing up a parallel operation.
This post explains why. Not as a sales pitch — as a description of how the two industries actually work.
1. Different vendors. No overlap.
DeleteMe, Optery, Aura, Kanary, Incogni, Privacy Duck — all of them work against essentially the same list of people-search and data broker companies. Whitepages. Spokeo. BeenVerified. Radaris. PeopleFinder. Intelius. MyLife. Plus a few hundred more depending on the service.
None of those companies are facial recognition databases. PimEyes, Precheck.ai, FaceCheck.id, Clearview AI, Corsight, EyeMatch, Social Catfish, FindClone — these are face search engines, not people-search sites, and they don't appear on any data broker service's vendor list. The removal services don't track them, much less submit removals to them.
The vendor lists don't overlap. A removal pipeline built for Whitepages doesn't apply to PimEyes. The opt-out URLs are different. The contact methods are different. The data formats they accept are different. There's no shared infrastructure to extend.
2. Different opt-out processes.
People-search opt-outs follow a pattern: submit a name, address, and email; sometimes upload a redacted ID; wait 7–30 days; check if the listing disappeared. The whole flow is text-based.
Facial recognition opt-outs follow a different pattern: submit a high-resolution face photo and a government ID with the photo intact. The system is doing a face-to-face match between your selfie and your ID before they'll add you to an exclusion list. Your name and address are usually irrelevant.
3. Different verification requirements.
For a data broker, "is this you?" is answered with name + address + email + maybe a phone number. For a facial recognition database, it's answered by checking that the face in the submitted photo matches the face on the submitted government ID — and that the ID is valid.
That second flow involves biometric matching, document verification, and a multi-step review. PimEyes routinely rejects photos for "insufficient quality" or "face mismatch" — failure modes a data broker service never encounters, because it doesn't handle photos at scale.
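To make the contrast concrete, here's a simplified sketch of what that verification step looks like conceptually. The function names and threshold are hypothetical, and the embeddings here are toy vectors — a real system like PimEyes runs a face recognition model over both images and rejects the request if the faces don't match. The core check, though, really is a similarity comparison between two face embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_selfie_against_id(selfie_emb, id_emb, threshold=0.8):
    """Hypothetical verification gate: accept the opt-out request only
    if the selfie and the ID photo appear to be the same person.
    Real systems use model-specific thresholds tuned on test data."""
    return cosine_similarity(selfie_emb, id_emb) >= threshold

# Toy 3-dimensional "embeddings" (real ones are 128-512 dimensions)
selfie      = [0.9, 0.1, 0.3]
matching_id = [0.88, 0.12, 0.28]   # same person, slightly different photo
wrong_id    = [0.1, 0.9, -0.5]     # different person

print(verify_selfie_against_id(selfie, matching_id))  # True
print(verify_selfie_against_id(selfie, wrong_id))     # False
```

A text-based data broker pipeline has no equivalent of this step anywhere in its stack — which is the point.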
4. Different laws.
Both industries operate under privacy law, but the relevant statutes are different.
- Data brokers are governed primarily by general data privacy laws — California's CCPA/CPRA, EU GDPR's general provisions, plus state-level data broker registration laws (California, Vermont, Texas, others). The rights are usually around "data about me" generically.
- Facial recognition falls under biometric privacy — Illinois BIPA, Texas CUBI, Washington's biometric law, plus GDPR Article 9 (which classifies biometric data as a "special category" with stricter handling). The rights and procedures are specific to biometric data and often have stronger penalties.
A data broker removal service that knows GDPR Articles 15–17 inside out doesn't automatically know Article 9 case law or BIPA settlement history. Different legal expertise, different precedent.
5. Different technical realities.
Here's where it gets specific. Facial recognition databases don't just store names and metadata — they store face embeddings. A face embedding is a 128- to 512-dimensional vector that mathematically represents the face's distinctive features.
When you opt out, you're asking the database to add an "exclusion" entry that contains your embedding — so future searches that produce a similar embedding return "no match." Doing that well requires running the same face recognition model on your submitted photo, to extract the same kind of embedding the database uses.
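Here's a deliberately simplified sketch of how an exclusion list changes search behavior. Everything below is illustrative — the function names are made up, the vectors are toys, and production systems use approximate nearest-neighbor indexes rather than linear scans — but the mechanism is the one described above: a query that lands near an excluded embedding is suppressed.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def search_with_exclusions(query_emb, index, exclusions, threshold=0.85):
    """Hypothetical search with an opt-out exclusion list.
    If the query face is close to any excluded embedding, suppress
    all results; otherwise return labels of entries within threshold."""
    if any(cosine_similarity(query_emb, ex) >= threshold for ex in exclusions):
        return []  # opted-out face -> "no match"
    return [label for label, emb in index
            if cosine_similarity(query_emb, emb) >= threshold]

# Toy index of indexed photos and their embeddings
index = [("photo_1", [1.0, 0.0]), ("photo_2", [0.0, 1.0])]
query = [0.99, 0.05]  # a new search photo of the same person as photo_1

print(search_with_exclusions(query, index, exclusions=[]))
# -> ['photo_1']  (no exclusion entry, so the match comes back)

print(search_with_exclusions(query, index, exclusions=[[1.0, 0.02]]))
# -> []  (the person opted out; the same search now returns nothing)
```

Note that the exclusion check only works if the excluded embedding was produced by the same model family the index uses — which is why submitting a name and address accomplishes nothing here.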
This is a different technical operation than removing a row from a SQL table by name. It requires understanding how facial recognition systems represent identity at the model level. That's why the team at FacePrivacy comes from years of building facial recognition systems — same skill set, different application.
6. Different business models.
The data broker industry has hundreds of small vendors selling each other lists. Removal works because there's a regulatory framework (CCPA, GDPR) that gives consumers leverage and a relatively standardized opt-out process across vendors.
The facial recognition industry has fewer, larger vendors with more aggressive moats. PimEyes is in the Seychelles. Clearview AI is closely held. Many vendors operate primarily B2B (selling to law enforcement or enterprise) and don't have consumer opt-out processes at all. Some that do have them make those processes deliberately friction-heavy.
A consumer-focused data broker service has neither the leverage nor the workflow to handle this fragmented, friction-heavy landscape. We had to build the workflow specifically.
So what does this mean if you're already paying for one?
If you're paying for DeleteMe, Optery, Aura, Kanary, Privacy Duck, or Incogni — keep paying for it. They're doing real work in their lane. Removing your name and address from data brokers cuts off one significant attack vector (the "search by name" path).
Add FacePrivacy on top to cut off the other one (the "search by photo" path). The two services together cost roughly $20/month. Cheaper than Netflix and Spotify combined. More useful than either of those.
Add the face layer.
FacePrivacy submits removal requests to PimEyes, Precheck.ai, FaceCheck.id, Clearview AI, and other facial recognition databases on your behalf. Pairs cleanly with whichever data broker service you already use.
Start Removal →
Code PRECHECK for 15% off your first month.