
Clearview AI facial recognition sued again – this time by ACLU

Clearview AI, the company that's scraped billions of images to build a facial recognition system, is getting sued again.

The facial recognition company that everyone – or at least a large chunk of everyone – loves to hate, Clearview AI, is to get yet another day, and perhaps very much longer than that, in a Chicago courtroom.
The American Civil Liberties Union (ACLU), together with four community organisations based in the US state of Illinois, has brought a civil suit that states as its purpose to “put a stop to [Clearview’s] unlawful surreptitious capture and storage of millions of Illinoisans’ sensitive biometric identifiers”, and “to remedy an extraordinary and unprecedented violation of Illinois residents’ privacy rights”.
Clearview already faced a class-action lawsuit filed in January, also in Illinois, for collecting biometric identifiers without consent.
Illinois is home not only to Chicago, the third-biggest metropolis in the US, but also to a state law called BIPA, short for Biometric Information Privacy Act, which imposes the USA’s strictest legal controls over the collection and use of biometric data.


In case you missed it, Clearview AI pitches itself, on its corporate website, as offering “computer vision for a safer world”, and describes its services as follows:

Clearview AI is a new research tool used by law enforcement agencies to identify perpetrators and victims of crimes.
Clearview AI’s technology has helped law enforcement track down hundreds of at-large criminals, including pedophiles, terrorists and sex traffickers. It is also used to help exonerate the innocent and identify the victims of crimes including child sex abuse and financial fraud.
Using Clearview AI, law enforcement is able to catch the most dangerous criminals, solve the toughest cold cases and make communities safer, especially the most vulnerable among us.

In simple terms, Clearview trawls the internet for publicly available images of people, notably images that are already tagged in some way that identifies the people in the picture, and builds an ever-burgeoning index that can map faces back to names.
Loosely speaking, it tracks down pictures of you that are already available online, for example as snapshots on a social network, then analyses your face in those images to create what amounts to a faceprint.
It then combines that faceprint with existing data against your name to refine and improve its accuracy in matching untagged images against existing, tagged ones.
For example, if you’re a police officer, you might put in a picture of a “person of interest” whom you snapped during a lawful surveillance operation, and Clearview would try to find out who it was and to point you in the direction of useful intelligence about them.
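To make the mechanics concrete, here’s a minimal sketch of that kind of enrol-and-match pipeline – emphatically not Clearview’s actual code, whose internals aren’t public – built on the open-source face_recognition library. The filenames, names and the 0.6 match threshold are illustrative assumptions:

```python
# A minimal sketch of a scrape-index-match pipeline, NOT Clearview's
# actual code. Uses the open-source face_recognition library; the
# filenames and names below are made-up examples.
import face_recognition
import numpy as np

# "Enrol" tagged photos: compute a 128-dimensional faceprint per face.
index = {}  # name -> list of faceprint vectors
for name, path in [("alice", "alice_profile.jpg"), ("bob", "bob_party.jpg")]:
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        index.setdefault(name, []).append(encodings[0])

# Query: faceprint an untagged photo and look for the nearest known face.
query_image = face_recognition.load_image_file("person_of_interest.jpg")
query = face_recognition.face_encodings(query_image)[0]

best_name, best_distance = None, float("inf")
for name, prints in index.items():
    # Euclidean distance between 128-d vectors; smaller means more similar.
    distance = min(np.linalg.norm(p - query) for p in prints)
    if distance < best_distance:
        best_name, best_distance = name, distance

# 0.6 is the library's conventional acceptance threshold.
if best_distance < 0.6:
    print(f"Possible match: {best_name} (distance {best_distance:.2f})")
else:
    print("No confident match")
```

At Clearview’s claimed scale you’d replace the linear scan with an approximate nearest-neighbour index, but the principle – faces become vectors, and matching becomes a distance query – stays the same.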
Interestingly, the company wasn’t created specifically to address law enforcement needs or desires, but rather to create a massively scalable facial recognition system and then find a market for it.
In a CNN interview with Clearview founder Hoan Ton-That published earlier this year, Ton-That talks about how the company…

…spent about two years really perfecting the technology and the accuracy and the raw facial recognition technology accuracy. […] We have to search out of billions and billions of photos and still provide an accurate match. [… T]he second part of it was, ‘What’s the best application of this technology?’ And we found that law enforcement was such a great use case.

Ton-That admitted, in that interview from March 2020, that the company did have customers in the private sector, but claimed that all of them were “trained investigators.”
But that stance has changed recently, due to the class-action lawsuit mentioned above, with the company apparently agreeing in May 2020 that it would not sell to private entities any more, and would not sell its services at all in Illinois.

Is it legal?

As you can probably imagine, Clearview’s attitude is that the data it is searching and acquiring is already public – it doesn’t use pictures that are uploaded for private access only – and so all it is doing is creating its own index of images that anyone who wanted to could find and peruse on their own.
Following that argument, you could choose to treat Clearview as a special case of a search engine, many of which provide their own reverse image search features, apparently without the same level of legal controversy.
On that basis, you might conclude that Clearview is being penalised simply for being technologically successful and “building a better mousetrap.”
But the ACLU and its fellow plaintiffs in this latest legal suit disagree, arguing that Clearview isn’t just collating public images but scraping them in vast quantities and then converting them into faceprints rather than merely storing them as images.
Faceprints, says the ACLU, are not only biometrics as far as the public is concerned, but are also explicitly listed as biometric identifiers in BIPA itself.
Interestingly, BIPA excludes photographs themselves from its list of biometric identifiers – it’s the processing of the photographs to extract a “recogniser” that makes the data biometric:

In [the Biometric Information Privacy] Act:
“Biometric identifier” means a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry. Biometric identifiers do not include writing samples, written signatures, photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color. […]

BIPA requires not only that anyone whose photos get “biometricised” be asked for written consent up front, but also that they have a right to know how their data will be used, and by whom, and when it will be deleted.
It’s hard to see how Clearview could ever practicably comply with the consent part of this law, if the court deems that the ACLU is right. (It also feels likely that more and more US states, and even the Federal Government, might enact similar legislation in the near future.)
It also seems unlikely that enough people would ever give their consent to make the service work usefully if the court rules that consent is always and explicitly necessary. (Would you consent? Let us know, and why, in the comments below.)
You’d also think that purging faceprints from the database – and, presumably, retraining the entire system every time an entry is deleted to ensure that the withdrawn faceprint no longer affects the search results – might end up taking more effort than adding new faceprints that were acquired with consent.
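Whether deletion really is that expensive depends on an architectural detail Clearview hasn’t published, so the following is speculation on our part: if faceprints live only in a searchable index, purging one is a cheap delete, but if the withdrawn photos also helped train the underlying embedding model, honouring a deletion properly could mean retraining. In this sketch, train_embedding_model is hypothetical:

```python
# Speculative sketch -- Clearview's actual architecture isn't public.

def purge_from_index(index: dict, name: str) -> None:
    # If faceprints are just stored vectors, deletion is trivial:
    # once removed, future queries simply can't match against them.
    index.pop(name, None)

def purge_and_retrain(training_photos: list, withdrawn_owners: set):
    # But if the withdrawn photos were also used to *train* the
    # embedding model, their influence is baked into its weights,
    # and honouring the deletion means retraining on what's left.
    remaining = [p for p in training_photos if p["owner"] not in withdrawn_owners]
    return train_embedding_model(remaining)  # hypothetical, and expensive
```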

Yet bigger foes

What makes life even tougher for Clearview right now is that it also faces a group of opponents that is perhaps an even bigger foe to its business model than the Circuit Court of Cook County, Illinois.
Earlier this year, Twitter, Facebook (which owns Instagram) and Google (which owns YouTube) told Clearview that images on their sites weren’t there for scraping.
It seems that those three social media behemoths aren’t just objecting to the fact that their terms and conditions don’t allow third parties to scrape their data for commercial purposes.
Even if Clearview were to offer to pay a fair market price for the right to scrape images from Facebook, Twitter and Google, thereby settling the contractual angle, it looks as though those companies would refuse anyway.
We’d imagine that images from those three companies alone account for a lot of the accuracy in Clearview’s service.

What to do?

There’s an elephant in the room in this story, of course.
The problem here is that even if Clearview loses this case and ends up not using scraped images at all, those images nevertheless remain publicly accessible, and are prone to scraping anyway by companies or groups that, unlike Clearview, will not seek to publicise their work.
Simply put, even if these lawsuits end up establishing that you have an implicit and irrevocable expectation of privacy when you upload pictures online, and even if there are legal precedents that successfully inhibit companies from openly using your published images for facial recognition purposes…
…those legalisms won’t actually stop your photos getting scraped, in just the same way that laws criminalising the use of malware almost everywhere in the world haven’t put an end to malware attacks.
So we have to end with two pieces of advice that, admittedly, make today’s internet a bit less fun than you might like it to be:

  • If in doubt, don’t give it out. By all means publish photos of yourself, but be thoughtful and sparing about quite how much you give away about yourself and your lifestyle when you do. Assume they will get scraped whatever the law says, and assume someone will try to misuse that data if they can.
  • Don’t upload data about your friends without permission. It feels a bit boring, but it’s the right thing to do – ask everyone in the photo if they mind you uploading it, ideally before you even take it. Even if you’re legally in the right to upload the photo because you took it, respect others’ privacy as you hope they’ll respect yours.

Let’s aim for a truly opt-in online future, where nothing to do with privacy is taken for granted, and every picture that’s uploaded has the consent of those in it.



4 Comments

“Let’s aim for a truly opt-in online future, where nothing to do with privacy is taken for granted, and every picture that’s uploaded has the consent of those in it.”
I couldn’t agree more, Paul. But in the real world that is never going to happen, unfortunately.


With the implementation of facial recognition for MFA, I would say that this could be a bigger issue than anyone realizes. Using the biometric data for identification, someone may be able to take the data and reverse-engineer a face, or at least one close enough to fool facial recognition security.


There’s a larger issue that needs to be addressed in this case as well.
If every picture uploaded should have the consent of whomever is in the picture, then does that mean that parents can no longer post pictures of their children online? Until they reach whatever the age of majority is in their municipality, the children do not have a right to say yes or no.
One point that has always bothered me, as a parent and as someone who works in the tech field and is aware of these issues, is parents posting their children’s pictures online without regard to how those photos will follow them for all eternity. Yes, it’s cute what little Johnny was doing today in the back yard, but 25 years later that picture will show up somewhere when they least expect it, and they had no say in the posting of the photo or how it was used, and by whom.
How this relates to this article and issue: mom and dad think it’s cute, post it online for everyone to see because they don’t understand privacy settings, and it gets scraped up by a company like Clearview and “biometricised”. From that point on, every time mom or dad or grandma or Aunt Lil posts another public picture of little Johnny online, it can get scraped up and added to the collection under ‘little Johnny’ and voilà – a time-lapse aging photo montage of little Johnny. Fast-forward 15-20 years: little Johnny is not so little any longer, and decides he wants to go into a job in law enforcement. He does well working up through the ranks, but then applies for a position doing undercover work. A background check is done, and little Johnny can’t get the promotion and job he’s been working towards for several years because of all those pictures of him posted online over the years; he’s not an ‘unknown’ who can work undercover, he’s a verifiable law enforcement officer, and his dreams go up in smoke.

