
AI fake porn could cast any of us

The faces of Daisy Ridley, Emma Watson, Gal Gadot and other celebrities have appeared near-seamlessly stitched onto porn videos using neural network tech.

In the case of revenge porn, people often ask: If the photos weren’t taken in the first place, how could ex-partners, or hackers who steal nude photos, post them?
Unfortunately, there’s now an answer to that rhetorical question. Forget fake news. We are now in the age of fake porn.
Fake, as in, famous people’s faces – or, for that matter, anybody’s face –  near-seamlessly stitched onto porn videos. As Motherboard reports, you can now find actress Jessica Alba’s face on porn performer Melanie Rios’ body, actress Daisy Ridley’s face on another porn performer’s body and Emma Watson’s face on an actress’s nude body, all on Celeb Jihad – a celebrity porn site that regularly posts celebrity nudes, including stolen/hacked ones.
Here’s Celeb Jihad crowing about a clip of a woman showering:

The never-before-seen video above is from my private collection, and appears to feature Emma Watson fully nude…

The word “appears” is key. It is, rather, an example of what’s being called a deepfake.
Motherboard came across the “hobby” of swapping celebrities’ faces onto porn stars’ bodies in December, when it discovered a redditor named “deepfakes” who had made multiple convincing porn videos, including one of “Wonder Woman” star Gal Gadot apparently having sex with her stepbrother.
He also created porn videos with publicly available video footage of Maisie Williams, Scarlett Johansson, and Taylor Swift, among others. Deepfakes posted the hardcore porn videos to Reddit.
He used a machine learning algorithm, his home computer and open-source code.
At that point, the results were a little crude: for example, the image of Gadot’s face didn’t track correctly, Motherboard reported. You’d have to look closely to discern that it was a fake, though – at first blush, it’s quite believable.


Since then, production of AI-assisted fake porn has “exploded,” Motherboard reports. Thousands of people are doing it, and the results are ever more difficult to spot as fakes. You don’t need expertise with sophisticated artificial intelligence (AI) technologies at this point, either, given that another redditor – deepfakeapp – created an app named FakeApp that’s designed to be used by those without computer science training.

All the tools one needs to make these videos are free, readily available, and accompanied with instructions that walk novices through the process.

We haven’t seen anything yet: the coming weeks will bring deepfakes to all with the mere press of a button, deepfakeapp promised Motherboard in an email exchange.

I think the current version of the app is a good start, but I hope to streamline it even more in the coming days and weeks. Eventually, I want to improve it to the point where prospective users can simply select a video on their computer, download a neural network correlated to a certain face from a publicly available library, and swap the video with a different face with the press of one button.

From the subreddit’s wiki:

This app is intended to allow users to move through the full deepfake creation pipeline – creating training data, training a model, and creating fakes with that model – without the need to install Python and other dependencies or parse code.

As the news about deepfakes has spread to major news outlets, so too has condemnation of the rapidly advancing technology. Motherboard spoke with Deborah Johnson, Professor Emeritus of Applied Ethics at the University of Virginia’s school of engineering, who said that neural-network-generated fake videos are undoubtedly going to get so good that it will be impossible to tell the difference between the AI-produced fakes and the real thing.

You could argue that what’s new is the degree to which it can be done, or the believability, we’re getting to the point where we can’t distinguish what’s real – but then, we didn’t before. What is new is the fact that it’s now available to everybody, or will be… It’s destabilizing. The whole business of trust and reliability is undermined by this stuff.

One redditor – Gravity_Horse – took to the subreddit to address the “distain [sic] and ire” that’s arisen. Part of his argument was an admission that yes, the practice is a moral affront:

To those who condemn the practices of this community, we sympathize with you. What we do here isn’t wholesome or honorable, it’s derogatory, vulgar, and blindsiding to the women that deepfakes works on.

(Note that the technology would of course work on any gender: As Motherboard reports, one user named Z3ROCOOL22 combined video of Hitler with that of Argentine President Mauricio Macri.)
Gravity_Horse goes on to justify deepfake technology. One of his rationales: be thankful that this technology is in the hands of the good guys, not the creepy sextortionists:

While it might seem like the LEAST moral thing to be doing with this technology, I think most of us would rather it be in the hands of benign porn creators shaping the technology to become more focused on internet culture, rather than in the hands of malicious blackhats who would use this technology exclusively to manipulate, extort, and blackmail.

Gravity_Horse, are you serious? What in the world makes you think that a freely available app that’s truly God’s gift to sextortionists will be passed over by malicious actors?
Gravity_Horse also argues that this is nothing new. Photoshop has been around for years, he says, and it hasn’t resulted in courts being flooded with faked evidence or in the creation of believable revenge porn photos. Technology can create believable fakes, but it can also detect the forgeries, he argues.
Finally, Gravity_Horse argues that, ironically, as deepfakes proliferate and become mainstream…

Revenge porn actually becomes LESS effective, not more effective. If anything can be real, nothing is real. Even legitimate homemade sex movies used as revenge porn can be waved off as fakes as this system becomes more relevant.

Lord, I hope you’re right about that, because this genie is out of the bottle.


24 Comments

I sense a new business opportunity here – personalised porn, where anyone can star in their own porn movie. No, I won’t be signing up – I am just pointing out the inevitable next step. On the other hand, maybe if everyone does a pre-emptive strike with fake porn videos of themselves, it will devalue hacked genuine stuff and revenge porn to the point of killing them off. I guess fake porn videos could be used to help train AI filters to block all porn and porn could only be transmitted between consenting suppliers and receivers (porner and pornee, to coin terms).


Putting people’s heads on different people’s nude bodies has been around a very long time. The only thing new is new algorithms to try to make the image more believable. If someone tries to manipulate you by threatening to expose fake nude pictures of you, you just say they’re fake, and you can’t control what losers waste their time on.


This technology could usher in a new epidemic of porn addiction as companies pop up to provide personalized mix-and-match pornography, merging a paying addict’s favorite person (a mate or celebrity, for instance) with their favorite fetish.


While I would like to think this would cancel out revenge porn, I can’t help but think about all the people who would have to suffer first before that even became real. I mean, what if people started putting children’s faces on this stuff… is it still against the law? Even if it’s fake, how much damage will have been done by the time the truth comes out? It’s not like there’s an app they’re building with this that can evaluate if it’s fake or not.


Exactly. Who’s going to be policing deepfakes to determine if children’s faces (which I imagine would be more credible as porn actors the closer they get to puberty, and beyond puberty, given how our faces begin to morph beyond having telltale aging signs [loss of baby fat, a thinning of soft facial tissue]) have been used? Who’s going to take the time to discern whether a deepfake “porn actor” is in actuality, say, 15 years old?


The legality would depend on the jurisdiction. In the UK for example you don’t need to be able to see the child with the human eye for it to be child porn. Anything that at any stage of production involved an image of a child is still an image of a child.
Though for this technology you need a head similar to the face being pasted onto it or the face will be distorted out of all recognition anyway.
Still, I am less concerned about whether a video from my past was sufficiently erased than I was a week ago.


Beyond the aspect of children being unwillingly cast as porn actors, there’s no age where it’s morally OK to appropriate people’s faces without their consent, save when done in the context of legally acquired mugshots.


Isn’t there a copyright issue hiding in all of this? This is not parody; people are mashing together two copyrighted works to make a derivative.


The only positive I see here is the admission from Gravity_Horse that what’s being done is amoral. I don’t understand why the arguments for the technology have to go beyond that, but if they do have to go beyond that, I’d like to see a thread on the subject of “and here’s how we’re going to ensure the technology is only used on willing subjects, and most certainly not on those who aren’t at the age of legal consent.”


Never mind fake porn – what about when any public figure can become an actor for the sake of fake news? Or can use the existence of such technology to deny events that really took place?


Way back last century, many frat house fridges and noticeboards had head photos attached to magazine bodies.


It’s not so much the fact that the ability to glue the image of somebody’s head onto the image of another person’s body is new; it’s that AI is generating images that are far more difficult to discern as fakes, particularly when it’s live-action, with the deepfake moving in a way that matches the actor’s movements so well.
This is a whole ‘nuther ballgame when compared with the Monty Python-esque cut-and-paste stuff we’re used to seeing and to easily (more or less) spotting as fake.


It goes without saying that in a decade we’ll look back at this stuff and shake our heads about how anyone ever thought it was realistic.
People were horrified at the ultra-realistic Princess Diana fakes, which weren’t very good in hindsight.


People have been doing Photoshop for a while now. This really is not new, even though an AI is doing it. Ever hear of an app for your phone called Face Swap? With my phone and this app, I can take any picture of someone’s face and place it onto another. In most cases the app does a very good job of placing someone’s, anyone’s, face onto another. But you do not seem to think anything of that.
Plus, there are a ton of people who post pictures of themselves on the public internet. And what do you say about a really good Photoshop pic?
This technology is really no different from Photoshop, apart from the AI portion. Otherwise, it is Photoshop for videos.


Let’s leave it to the (real) porn industry to sort out the legalese, as they have always done to date. Don’t forget they are the bleeding edge (no pun intended). Think they’ll roll over for some idiot’s head-fake on one of their stars? Not.
Unfortunately, at present, it is still legal.


Whether it’s legal or not is debatable: as others have mentioned, one imagines there are copyrights involved. Whether it’s against various sites’ policies against revenge/nonconsensual/involuntary pornography is clear, though, as we can see by the growing number of sites that are taking down deepfakes.


Wait. No, now that I think about it, it’s not whether my opinion matters; it could be my life that’s at stake here. My face in the wrong video could have dire consequences, and possible jail time. Definitely not something I want.


Cameron, I think most people would instantly disagree with you about being in a porno, for the simple fact that it could be extremely embarrassing, disrespectful and a violation of people’s lives and rights. So, sorry to be a little disrespectful here, but you are very ignorant; life isn’t about sex and drugs, buddy. Most of us have lives and jobs we worked very hard for, and being in a porno that could be potentially harmful to the lives we’ve worked for won’t settle very well with most people. :/


Cameron said: “as long as its not guy on guy so have it”
Oh, it probably will be. And therein lies the problem.

