Naked Security

Neural face recognition network tuned with 650,000 pornstar images

But did anyone ask the actors, all women, for consent to use their images?

A facial recognition startup called Pornstar.ID, a reverse-image lookup for identifying porn actors, has trained its neural network on upwards of 650,000 images of more than 7,000 female adult performers.

NSFW: If you really want to find out who a given adult film performer is, maybe wait until you’re not in the office. Neither the Pornstar.ID site nor the tweets it posts are safe for work.

No, I didn’t know who the performer in its example tweet was before Pornstar.ID provided her profile. Now I know her stage name, at least.

But I still don’t know the answers to a few questions that are more pressing from a privacy point of view. Did those performers consent to being identified and listed on the Pornstar.ID site? Did they agree to have their biometrics scanned to train a neural network? And is there any law that says their published images, presumably posted online for all to see (or purchase), aren’t up for grabs for training facial recognition deep learning algorithms?

As a matter of fact, there are such laws concerning face recognition. The Electronic Privacy Information Center (EPIC) considers the strongest of them to be the Illinois Biometric Information Privacy Act, which prohibits the use of biometric recognition technologies without consent.

Indeed, much of the world has banned face recognition software, EPIC points out. In one instance, under pressure from Ireland’s Data Protection Commissioner, Facebook disabled facial recognition in Europe: recognition it had been doing without user consent.

So yes, depending on where you live, there are laws against facial recognition without consent. It’s not clear whether Pornstar.ID’s use of facial scanning falls foul of these laws.

Users can upload photos to Pornstar.ID’s site or send an image via tweet to have Pornstar.ID identify a woman performer (for whatever reason, Pornstar.ID focuses only on women, not male adult film stars). If Pornstar.ID can’t figure out who the performer is, it will suggest a list of similar-looking women.
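Pornstar.ID hasn’t published how its lookup works, but reverse-image services of this kind are typically built on face embeddings: a neural network maps each face to a fixed-length vector, and identification becomes a nearest-neighbour search over a database of known faces. Here’s a minimal sketch of that pattern in Python, where embed_face() is a hypothetical stand-in for whatever embedding model a real system would use, and the threshold is illustrative only:

    import numpy as np

    def embed_face(image):
        # Hypothetical stand-in: a real system would run an aligned face
        # crop through an embedding network and return a fixed-length vector.
        raise NotImplementedError("plug in a face-embedding model here")

    def identify(query_image, db_embeddings, db_names, threshold=0.6, top_k=5):
        """Return a confident match, or a list of similar-looking candidates."""
        q = embed_face(query_image)
        q = q / np.linalg.norm(q)                        # unit-normalise the query
        db = db_embeddings / np.linalg.norm(db_embeddings, axis=1, keepdims=True)
        sims = db @ q                                    # cosine similarity to each known face
        ranked = np.argsort(sims)[::-1]                  # best match first
        if sims[ranked[0]] >= threshold:
            return db_names[ranked[0]]                   # confident identification
        return [db_names[i] for i in ranked[:top_k]]     # fallback: similar-looking list

The fallback branch is what would produce a “similar-looking women” list when no single match clears the confidence threshold.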

Pornstar.ID creator Mike Conrad told Digital Trends that the app was a “no-brainer” given the pairing of technology advancement with the eagerness of fans to identify stars:

Together with the huge steps forward with facial recognition technology in the past few years, and the increasing demand for identification of people in adult movies, the launch of this project was a no-brainer.

Conrad said the neural network is still in beta. It currently works only on still images, but there are plans to enable it to recognize faces in videos. The team plans to launch an improved version this month.

There are plans to open up the current API to third parties, “such as tube websites,” with video support, for “automatic tagging of the adult performers”.

(According to a Quora user who describes herself as an adult film star, a site owner and producer, “tube” is shorthand for a porn site with an interface similar to that of YouTube, with user-uploaded videos, a system of likes/dislikes, view counter, and comments.)
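Pornstar.ID hasn’t described how the planned video tagging would work, but the obvious approach is to sample frames at intervals, run a still-image recogniser over each one, and only tag performers spotted in several frames. A rough sketch, assuming a hypothetical identify(frame) callable that returns a performer’s name on a confident match, with OpenCV handling the frame capture:

    import cv2                      # OpenCV: pip install opencv-python
    from collections import Counter

    def tag_video(path, identify, sample_every_s=2.0, min_hits=3):
        """Sample frames and tag performers recognised in several of them."""
        cap = cv2.VideoCapture(path)
        fps = cap.get(cv2.CAP_PROP_FPS) or 25.0        # fall back if FPS is unknown
        step = max(1, int(fps * sample_every_s))       # frames between samples
        hits, frame_idx = Counter(), 0
        while True:
            ok, frame = cap.read()
            if not ok:                                 # end of video
                break
            if frame_idx % step == 0:
                name = identify(frame)                 # hypothetical recogniser
                if isinstance(name, str):              # count only confident matches
                    hits[name] += 1
            frame_idx += 1
        cap.release()
        # Requiring several sightings damps single-frame false positives.
        return [name for name, n in hits.items() if n >= min_hits]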

Automatic identification of individuals simply by watching videos of them: is that an exciting use of facial recognition, or a terrifying one? I guess the answer depends on whether you’re a fan or somebody whose faceprint is being used.

From the perspective of the performers, this might be cause for concern.

Last year, we saw sex workers stripped of anonymity in Russia, thanks to the use of a facial recognition service – FindFace – to match explicit photos with images posted to the Russian version of Facebook: the social network VK (formerly known as Vkontakte).

FindFace’s popularity exploded after photographer Egor Tsvetkov showed how easy it was to deanonymize people by taking photos of riders on the metro and matching the images with their VK social network profiles. His project was called “Your Face Is Big Data”.

Seeking to illustrate how invasive facial recognition can be, Tsvetkov photographed people sitting in front of him on the subway, then used FindFace to look for them in social networks.

Much like Pornstar.ID, FindFace let users feed it an image, whether taken on the metro or in whatever stalking situation presented itself, and matched it against publicly visible images and profile details on VK.

The content in question wasn’t marked private.

Tsvetkov’s project successfully matched the slumped-over, bundled-up people whose images he captured on the subway with their much shinier, occasionally salacious social media selves.

Will Pornstar.ID be similarly used to identify sex workers, even if they’re just walking down the street or grabbing a cup of coffee in a Starbucks? Could it someday be used to link to their social media profiles so fans or stalkers can find their real names, where they live, or whatever other details they’ve made publicly available (another example of why it’s such a good idea to button up the personal information we share on social media)?

I’ve reached out to Pornstar.ID to ask those questions, as well as the rationale for the women-only focus. I’ll update the story if I hear back.


7 Comments

Important questions you ask there. It saddens me that people are so casual with other people’s data. It’s almost unbearable when it’s about something as sensitive as sex work. Empathy is lost.


It’s all very sad really. “… the increasing demand for identification of people in adult movies, the launch of this project was a no-brainer”. I mean, is this really where we are as a society?


Could be a big problem for doppelgangers who don’t work in the adult film industry but wind up being identified as such.


That’s already happening.
There’s a “fake news” video doing the rounds claiming that the same woman appeared at four newsworthy events on NBC, as “proof of crisis actors”. One of the women clearly has a different nose, but the alt-right seem to think, as with “pizzagate”, that one tenuous connection outweighs a hundred pieces of evidence to the contrary.

This? This isn’t going to help that, not at all.


Well, on the positive side for “content creators”, it might make it easier for them to find copyrighted videos of themselves that were uploaded by random people.


There is much less invasive software that does that already; check out what happened with Vizio recently. Videos and images tend to be easily “fingerprintable” if you know what you’re doing (or just download the appropriate software libraries). Internet image board websites use a similar method (to what Vizio did) to reduce the number of duplicate images on their sites, and Google Image Search likely uses the same fingerprinting process to help with its blocked search results (it makes sense: it saves a lot of time, effort and code if you can get a high-accuracy wildcard).
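For what it’s worth, the “fingerprinting” described above is usually a perceptual hash rather than face recognition: near-duplicate images hash to nearly identical bit strings, so matching becomes a cheap Hamming-distance comparison. A minimal difference-hash (dHash) sketch in Python using Pillow, not tied to any particular site’s implementation:

    from PIL import Image           # Pillow: pip install Pillow

    def dhash(path, hash_size=8):
        """Difference hash: survives rescaling and mild re-encoding."""
        # Shrink to (hash_size + 1) x hash_size greyscale pixels.
        img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
        px = list(img.getdata())                        # row-major pixel values
        bits = 0
        for row in range(hash_size):
            for col in range(hash_size):
                left = px[row * (hash_size + 1) + col]
                right = px[row * (hash_size + 1) + col + 1]
                bits = (bits << 1) | (left > right)     # 1 if brightness increases
        return bits

    def hamming(a, b):
        """Differing bits between two hashes; small means 'same image'."""
        return bin(a ^ b).count("1")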

