
Deepfakes AI porn GIFs purged from Gfycat platform

Redditors involved in creating videos with celebrities’ faces stitched in via artificial intelligence (AI) are not pleased.
Their favorite service for posting what are known as deepfakes, Gfycat, has begun purging their creations.
On Friday, the chat service Discord shut down a channel that was spreading deepfake videos, citing its policy against non-consensual porn.
Motherboard first discovered the r/deepfakes subreddit in December, and it was Motherboard again that noticed when, on Tuesday, r/deepfakes users began commenting that their GIFs were being taken down from the image-hosting platform Gfycat.
Motherboard says a Gfycat spokesperson confirmed that it finds deepfakes “objectionable” and is deleting them from the site. Speaking to The Verge, a spokesperson said that its terms of service give it the right to remove such content:

Our terms of service allow us to remove content that we find objectionable. We are actively removing this content.

For its part, Discord cited its policy against revenge porn, another term for non-consensual (or involuntary) porn.


There are many alternative sites for uploading content, r/deepfakes users noted. But given the glare of the public eye that has been turned on the deepfakes phenomenon over the past week, which of those sites won’t melt?
That’s why one of the Redditors suggested to the group that they construct their own distributed system for uploading deepfakes content, one where their “hard work won’t go unrecognized”, and stop relying on others. So far, he said, they have a team of specialists, including a “crypto specialist”, to ensure that deepfakes continues to get coding support for “this incredible contribution to global [masturbation].”
From his post:

Bottom line: this community is in its infancy. Replacing faces is just the beginning. We all know that the future of fantasy is in this tech, and the more we build it the more fun we’ll have.
Lets give women a place to express their sexuality with varying levels of anonymity.

He didn’t explain what “varying levels of anonymity” means, but the comment was noteworthy in that it mentioned anything at all relating to privacy for the women (it’s mostly women) who are cast in these videos.
Meanwhile, a cone of silence seems to have been lowered over Reddit, which hasn’t responded to multiple requests for comment. As it is, its content policy prohibits involuntary porn.
But it’s not clear whether Reddit would recognize deepfakes as involuntary porn. Given its silence, it could well be weighing its commitment to avoiding censorship against its own involuntary porn policy, though its history does include removing violent content and stolen images of celebrities.
At any rate, Reddit’s policy stipulates that the subjects portrayed in involuntary porn need to make a complaint.