Good news for those who like to baffle facial recognition but need to do it on a tight budget: researchers have come up with glasses that effectively screw with the algorithms, yet don’t cost a fortune.
Not only that, but you can use them to impersonate celebrities – at least, as far as the systems’ artificial intelligence algorithms are concerned.
Researchers at Carnegie Mellon printed the eyeglass frames for about 22 cents each, as they described in a recent paper (PDF).
Compare that with Privacy Visor: a bulky facewrap, the early iterations of which were encrusted with LEDs that scattered the facial shadows that would otherwise be used by facial recognition algorithms to detect things like noses and eyes.
Privacy Visor was scheduled to debut in the spring for a cool $240.
Carnegie Mellon’s 22-cent frames may be a whole lot cheaper, but on what some of us may see as the downside, they lack the ability to make you look like you have the multifaceted eyes of a housefly.
The Carnegie Mellon frames could, in contrast, pass for normal eyeglasses. They enabled wearers not only to obscure their identity but to impersonate celebrities.
One subject wearing the frames – one of the researchers, a white male – managed to impersonate Milla Jovovich, a white female actress, 87.87% of the time.
His colleague, a South Asian female, could impersonate a Middle Eastern male 88% of the time.
Another researcher, a Middle Eastern male, didn’t have such luck: he managed to impersonate Clive Owen, a white male actor, only 16.13% of the time.
As you can see in the photos the researchers shared in a post on the Prosthetic Knowledge blog, the frames look like crudely cut-out cardboard festooned with fuzzy, pixellated prints in lurid neon colors.
They could get far more slick, though. Mahmood Sharif, co-creator of the glasses, told New Scientist that the same principle – tweaking the pixel colors that facial recognition algorithms use to piece together a guess at who's in a photo – could be replicated in a normal tortoiseshell pattern.
That means that the printed frames could trick computers, without alerting humans that the biometric recognition systems were being foiled.
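The idea of tweaking pixel colors to steer a classifier can be sketched in a few lines. Below is a toy illustration, not the researchers' actual method: it stands in for the face recognizer with a simple linear model and takes a single gradient-sign step (in the style of the well-known fast gradient sign attack), restricted by a mask to the "eyeglass frame" region – the attacker only changes the pixels they physically control.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a face classifier: one linear score per class over a
# flattened 8x8 "image". Real systems use deep networks, but the attack
# principle is the same: nudge pixels along the gradient of the target
# celebrity's score.
n_pixels, n_classes = 64, 3
W = rng.normal(size=(n_classes, n_pixels))

image = rng.uniform(size=n_pixels)   # the attacker's own face
target_class = 2                     # the celebrity to impersonate

# Mask of pixels the attacker controls: only the eyeglass-frame region.
frame_mask = np.zeros(n_pixels)
frame_mask[16:32] = 1.0              # a band across the eyes

# One gradient-sign step, applied only under the mask. For a linear
# model, the gradient of the target score w.r.t. the pixels is just
# that class's weight row.
grad = W[target_class]
epsilon = 0.5
adversarial = np.clip(image + epsilon * frame_mask * np.sign(grad), 0.0, 1.0)

scores_before = W @ image
scores_after = W @ adversarial
print("target score before:", scores_before[target_class])
print("target score after: ", scores_after[target_class])
```

Because the step only ever moves masked pixels in the direction that raises the target class's score, the impersonation score goes up while every pixel outside the frame region is left untouched – which is why, as the article notes, the perturbation can hide in a pattern a human would never question.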
That, in fact, is what the researchers were really interested in: not cheap means to protect your privacy against surveillance, but attacks on facial recognition systems.
Those averse to having their face recognized and stored away in growing biometrics databases curated by the likes of Facebook and US law enforcement likely see facial-recognition-thwarting advances such as these printed frames in a positive light.
As it is, local law enforcement are using it in secret, a sports stadium has used it to try to detect criminals at the Super Bowl, retail stores are tracking us with it, and even churches are using it to track attendance.
Researchers have even demonstrated that algorithms can be trained to identify people by matching previously observed patterns around their heads and bodies, even when their faces are blurred or obscured.
But while fooling facial recognition may protect individuals’ privacy against this increasingly ubiquitous surveillance, it can also allow criminals to evade surveillance, Sharif pointed out.
As far as protecting privacy goes, these printed frames come at an auspicious time.
In an ongoing legal battle over use of facial recognition, Facebook has said that users can’t stop it from collecting biometric data.
Since April 2015, it’s been wrangling over the matter in a class action lawsuit. The lawsuit claims that Facebook violates a unique Illinois law that carries fines of $1,000 to $5,000 each time a person’s image is used without permission.
Facebook, of course, isn’t asking for anybody’s permission, and it’s not looking good for all those who’ve had their mugs scooped up regardless, given that there doesn’t appear to be any way to prove what the courts call “concrete injury”.
As in, were you denied a job because you were recognized in a photo depicting you doing something frisky? A class action would result in a legally untenable sea of individualized stories like that, Facebook says.
Facebook has already been forced to strip facial recognition out of its photo app Moments in Europe. But so far, it’s been grazing free-range with biometrics in the US.
The latest twist in this class action suit could help illustrate, yet again, the stark difference between Europe and the US with regards to data privacy.
The case could have wide repercussions, regardless of how it plays out. As Chicago Business frames it, if judges side with facial recognition giants like Facebook and Google, they could kill off any lawsuits filed under the Illinois law.
If the courts disagree with the companies, we could see new restrictions on using biometrics in the US – maybe even on par with those in Europe.
We’ll let you know how it plays out. But in the meantime, if you’re concerned with your privacy, you could do worse than to start sporting some glam glasses.
expatmatt: I would expect a 100% success rate impersonating Elton John while wearing Carnegie Mellon’s glasses.

2072: This method is probably easy to thwart for AI developers once it is known to exist…

Akshay Kumar: That’s a good case study about glasses.