The home of deepfakes – videos with celebrities’ or civilians’ faces stitched in via artificial intelligence (AI) – has kicked them out.
After keeping mum while the issue exploded over the past few weeks, on Wednesday Reddit banned deepfakes.
Or, rather, Reddit clarified that its existing ban on involuntary pornography, which features people whose permission has neither been sought nor granted, covers faked videos.
Up until Wednesday, Reddit had a single policy covering two kinds of rule-breaking content: involuntary pornography and sexual or suggestive content involving minors. It has now split that single policy into two distinct policies, letting the involuntary pornography ban stand on its own and adding language that explicitly covers deepfakes:
Reddit prohibits the dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked.
The platforms willing to put up with people being involuntarily turned into porn stars are steadily dwindling in number: both Twitter and the giant pornography site Pornhub have also banned deepfakes, likewise calling them nonconsensual porn that violates their terms of service (TOS).
From an email statement a Pornhub spokesperson sent to Motherboard:
We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it. Nonconsensual content directly violates our TOS and consists of content such as revenge porn, deepfakes or anything published without a person’s consent or permission.
Pornhub had previously told Mashable that it was already taking down deepfakes flagged by users. Corey Price, Pornhub’s vice president, said that users have started to flag the videos and that the platform removes them as soon as the flags come in. Price encouraged anyone who finds nonconsensual porn to visit Pornhub’s content removal page to lodge an official takedown request.
As for Twitter, a spokesperson said that the platform isn’t just banning deepfakes; it’s also going to suspend the accounts of users identified as original posters of nonconsensual porn, as well as accounts dedicated to posting the content:
You may not post or share intimate photos or videos of someone that were produced or distributed without their consent.
We will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject’s consent. We will also suspend any account dedicated to posting this type of content.
Reddit, Twitter and Pornhub join earlier deepfake bans by the chat service Discord and by Gfycat, a former favorite of the Reddit crowd for posting deepfakes. Reddit is where the phenomenon first gained steam. Make that a LOT of steam: the r/deepfakes subreddit had over 90,000 subscribers as of Wednesday morning, before it was taken down.
It’s not only celebrities who can be cast as porn stars, of course, and it’s not just porn that’s being fabricated. We’ve seen deepfakes that cast Hitler as Argentina’s President Mauricio Macri, plus plenty of deepfakes featuring President Trump’s face on Hillary Clinton’s head.
As far as the legality of deepfakes goes, Charles Duan, associate director of tech and innovation policy at the R Street Institute, a think tank and advocacy group, told Motherboard that the videos infringe on the copyrights of both the porn performers whose bodies are used in the underlying videos and the celebrities whose faces, taken from interviews or copyrighted publicity photos, are glued onto those bodies. These groups could seek protection by filing a takedown notice under the Digital Millennium Copyright Act (DMCA).
But he says the solid copyright claim belongs not to the celebrities but to the video makers whose work is being appropriated:
The makers of the video would have a perfectly legitimate copyright property case against the uploader, and they would be able to take advantage of the DMCA to have the website take the vid down. Chances are, a similar practice would work as well for these sorts of videos … Not on behalf of the victim [whose face is being used] but the maker of the video. Which is the weird thing about that whole situation.
Reddit said that this is about making Reddit a “more welcoming environment for all users.”
Earlier on Wednesday, before the community was taken down, r/deepfakes moderators had informed subscribers that deepfakes featuring the faces of minors would be banned without warning and that the posters would be reported to admins. The subreddit’s rules had also included a ban on using the faces of non-celebrities.
Good! But too little, too late. Given that a user-friendly app that generates deepfakes with a few button presses had already been released, the subreddit’s rules didn’t mean that non-celebrities’ and children’s faces wouldn’t be used to make the videos. They just wouldn’t be tolerated on r/deepfakes… which itself is no longer tolerated on Reddit.
…or on Discord, Twitter, Pornhub, Gfycat, Imgur (which is also known to remove deepfakes, according to what was a running list on r/deepfakes), and… well, there will likely be more deepfake-banning services by the time this article is posted, at the rate this is going.
Is the genie back in the bottle? Doubtful! R/deepfakes members had already been discussing taking the whole kit and caboodle off Reddit et al. and setting up a distributed platform where they could happily AI-stitch away.
Reddit’s move is likely just a bump in the road. This technology isn’t going anywhere: it’s here to stay. And that’s not necessarily a bad thing.
The main problem here hasn’t been the technology; it’s the fact that the video creators are using it to produce explicit content without the permission of those involved.
roleary
People discussing a child’s face on an adult’s body are missing the obvious here. You need only look at the Donnelly Trumpton videos to see how different a face looks pasted onto a significantly different head. Better yet, look at the Trudeau-Trump mashup. You wouldn’t be able to identify the face if you didn’t know.
More concerning is what kind of explicit video (already illegal in most of the world) you could realistically put a child’s face on.
Luna
It doesn’t have to be just a child; it could be an older minor as well. Let’s be real here: there are plenty of teens who are programmers and are smart enough to get revenge on some ex in high school who broke their heart. Or just to do it for the lulz. In America, if she’s not 18 it’s illegal even if she gave consent. And with 14-year-olds looking nearly 20 these days, it wouldn’t be that difficult to use their face.