In 2015, Reddit admitted that in terms of privacy, it had blown it.
In the blowback over Celebgate, Reddit banned the subreddit that was devoted to handing out the stolen images of nude celebrities, but it left the rest of us non-celebs to just go suck wind.
So it pulled a mea culpa move and fixed it, changing its privacy policy so that even we, the non-glitzy, can request to have non-voluntary porn images removed from the site.
But somehow, Reddit’s mea culpa privacy egalitarianism hasn’t yet addressed the new phenomenon of deepfakes.
As was widely reported last week, developers are using artificial intelligence (AI) to near-seamlessly stitch (mostly) female celebrities’ images onto bodies in live-action porn films. The neural network-generated fabrications, called deepfakes, are very difficult to discern as fakes and are getting ever more credible. Needless to say, there’s nothing consensual about the porn being created.
But we’ve not yet heard anything from Reddit about shutting down the Deepfakes subreddit, which features the videos.
The same can’t be said for Discord, a chat service that on Friday shut down a user-created group that was spreading deepfake videos, citing its policy against revenge porn, another term for nonconsensual, involuntary porn.
The service reportedly did so after being contacted by Business Insider.
A company representative from Discord issued a statement:
Non-consensual pornography warrants an instant shut down on the servers whenever we identify it, as well as permanent ban on the users. We have investigated these servers and shut them down immediately.
Business Insider reports that the chat group, called “deepfakes,” included several channels in which users could discuss the technology or share videos that could be used to train the AI technology. About 150 Discord users were logged into the deepfakes server earlier on Friday, BI reports.
It sounds like the Discord deepfake users were trying to keep things decent – at least for fellow deepfake participants – stating in the group’s rules that members shouldn’t swear at each other:
It goes without saying that you should respect each other. Be polite, avoid excessive swearing, and speak English at all times.
That politeness doesn’t extend to deepfakes’ subjects: the women dragged unwillingly into the porn videos. The rules did state that videos couldn’t be doctored to include images of “amateurs”; instead, only images of celebrities and public figures were allowed. The rules also barred images or videos of children.
…which apparently puts female celebrities into the category of “do whatever you want,” as if they’re lesser humans whose images can be used without permission or repercussion.
I’ve reached out to Reddit to try to find out if it has any plans to hold the deepfakes subreddit accountable to its non-voluntary porn policy. I’ll update the story if I hear back.
Reddit, you did the right thing, eventually, when you took down the subreddit dedicated to sharing ripped-off material during Celebgate. It’s time to do the right thing again with deepfakes, which is just the latest spin on virtually mugging people for their images.