
Facial recognition used to strip adult industry workers of anonymity

A name-and-shame database is supposed to "save" husbands from wives who have appeared on porn sites.

As if we don’t already have enough facial-recognition dystopia, someone’s claiming to have used the technology to match the faces of porn actresses to social media profiles in order to “help others check whether their girlfriends ever acted in those films.”

https://twitter.com/yiqinfu/status/1133215940936650754

… because what more noble thing could possibly be done with powerful FR technology, programming skills, an abundance of spare time, access to the internet and a lot of computing power?

He could, for example, use the technology to actually help people instead of shaming them and exposing them to creeps and stalkers: say, by identifying and (privately) notifying women who’ve been victimized by sextortionists, revenge porn antagonists, the operators of hidden webcams, or partners who post intimate videos or images without consent – or women who’ve never appeared in intimate photos or videos at all but who’ve still been convincingly depicted in DeepFakes.

Facial recognition could be used to inform such victims, so they could act to stop the privacy invasion and/or stalking or whatever harm it’s caused. But that’s not what this guy had in mind.

“More than 100,000 young ladies”

The programmer in question claims to have “successfully identified more than 100,000 young ladies” in the adult industry “on a global scale.” He’s a user of the Chinese social media app Weibo and claims to be based in Germany. He said that this is all legally kosher, given that sex work is legal in Germany and that he’s been keeping the data safely tucked away.

https://twitter.com/yiqinfu/status/1133229746341449730

Commenters noted that while sex work is indeed legal in Germany, the country has strict privacy laws that make what he claims to be doing – and, mind you, these are just claims, as there’s no proof he’s done it at all – emphatically illegal.

Motherboard independently verified the translations of the Weibo user’s posts that were posted to Twitter by Yiqin Fu, a Stanford political science PhD candidate.

The posts have gone viral both in China on Weibo and in the US on Twitter, giving rise to heated discussion about the implications for privacy, technology and misogyny.

Says you

As Motherboard points out, the OP hasn’t provided proof for his claims. He’s published neither code nor databases. The only thing he’s posted to back them up is a GitLab page… one that’s empty, mind you. When Motherboard contacted the user over Weibo chat, he said he’d release a “database schema” and “technical details” next week, and offered no further comment.

Unfortunately, it’s all too easy to believe him. After all, the technology is accessible. Besides, if he did what he says he did, he’s not the first.
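How accessible? One-to-one face matching is now a few lines of off-the-shelf code. Here’s a minimal sketch using the open-source face_recognition Python library – the filenames are hypothetical placeholders, and the 0.6 distance cutoff is just that library’s commonly cited rule of thumb:

```python
# Minimal one-to-one face matching sketch using the open-source
# face_recognition library (pip install face_recognition).
# The image filenames are hypothetical placeholders.
import face_recognition

# Compute a 128-dimensional embedding for a face in a known reference photo.
known_image = face_recognition.load_image_file("reference_photo.jpg")
known_encoding = face_recognition.face_encodings(known_image)[0]

# Compute embeddings for every face found in a second, unknown photo.
unknown_image = face_recognition.load_image_file("unknown_photo.jpg")
unknown_encodings = face_recognition.face_encodings(unknown_image)

# Faces whose embeddings fall within ~0.6 Euclidean distance of each
# other are commonly treated as the same person.
for encoding in unknown_encodings:
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    print(f"distance = {distance:.3f} -> match: {distance < 0.6}")
```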

Three years ago, porn actresses and sex workers were being outed to friends and family by people using a Russian facial recognition service to strip them of anonymity. In early April 2016, users of an imageboard called Dvach began using the “FindFace” service to match explicit photos with images posted to the Russian version of Facebook: the social network VK (formerly known as Vkontakte).

The imageboard Dvach – Russian for 2chan – is called the Russian version of 4chan. FindFace’s popularity exploded after photographer Egor Tsvetkov showed how easy it was to deanonymize people by taking photos of riders on the metro and matching the images with their VK social network profiles.

His project was called “Your Face Is Big Data.”

Seeking to illustrate how invasive facial recognition can be, Tsvetkov photographed people sitting in front of him on the subway, then used FindFace to look for them in social networks.

Users could feed FindFace an image, whether taken on the metro or in whatever stalking situation presented itself, and the service matched it to public domain images and profile details in VK. Within three days of media coverage of Tsvetkov’s art project, Dvach users ran with it, launching the campaign to deanonymize actresses who appear in pornography.
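FindFace has never published its internals, but services like it generally boil “find this face” down to a nearest-neighbor search over face embeddings. Purely as an illustration – the function, names and threshold below are hypothetical, not FindFace’s – the matching step might look something like this:

```python
# Hypothetical sketch of one-to-many face matching: given a query
# embedding, find the closest stored profile embedding. A generic
# illustration, not FindFace's actual code.
import numpy as np

def nearest_profile(query_embedding, profile_embeddings, profile_names,
                    threshold=0.6):
    """Return the closest profile name, or None if nothing is close enough."""
    # Euclidean distance from the query to every stored embedding
    # (profile_embeddings is an N x 128 array, one row per profile).
    distances = np.linalg.norm(profile_embeddings - query_embedding, axis=1)
    best = int(np.argmin(distances))
    return profile_names[best] if distances[best] < threshold else None
```

At VK’s scale a real service would swap the brute-force scan for an approximate nearest-neighbor index, but the shape of the problem is the same.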

For a short time, the doxing campaign included a community on VK that Dvach users created to upload links to files containing copies of women’s social media pages. It was intended to preserve the information in case the women deleted their accounts or altered their privacy settings. The social network quickly banned the Dvach group following a complaint.

The “if they didn’t want it to be public…” line of thinking

Twitter is rife with commenters rolling out the oft-heard refrain: “If they didn’t want this to be public, why did they [fill in the blank]?”

You would think that the growth of DeepFakes and surreptitiously taken images would silence the “why’d you take that photo/video in the first place” crowd, but there is a point to be made when it comes to publicly posted porn videos.

Perhaps it’s legal if the content is publicly posted. But that doesn’t make it ethical to use FR and/or AI to stitch together the separate public and private personas that many porn stars have, until recently, been able to maintain to shield their privacy.

As the lawyer Jonathan Turley once said, it isn’t Big Brother knowing our every move that we need to worry about. It’s Little Brother. As in, what we really need to worry about are the consequences of countless millions of cameras in the hands of private individuals. Turley:

We’ve got a Constitution to protect us against Big Brother. The problem is Little Brother. And there is not anything in the Constitution to protect us against them.

No, there’s not. But at the very least, we have the ability to set our Facebook photos and posts to private.

2 Comments

Perhaps I’m missing something, but wouldn’t anonymizing this data totally devalue it (putting all legal and ethical elements of the debate aside for the moment)? I suppose you could query the data for the name of an adult “actress” and have it say whether it had identified her or not without the private information shared, but of what interest would that be to people? I wouldn’t be surprised if his information “leaks out” at some point. The only real hope is that Germany’s strong privacy laws discourage him from doing so.


Not sure why this is news or surprising. Until humans have identical values and morals, people will always use technology in ways with which some won’t agree.

