In 2015, Reddit admitted that in terms of privacy, it had blown it.
In the blowback over Celebgate, Reddit banned the subreddit devoted to handing out the stolen nude images of celebrities, but it left the rest of us non-celebs to just go suck wind.
So it pulled a mea culpa move and fixed it, changing its privacy policy so that even we, the non-glitzy, can request to have involuntary porn images removed from the site.
But somehow, Reddit’s mea culpa privacy egalitarianism hasn’t yet addressed the new phenomenon of deepfakes.
As was widely reported last week, developers are using artificial intelligence (AI) to near-seamlessly stitch the faces of (mostly female) celebrities onto porn performers’ bodies in live-action films. The neural network-generated fabrications, called deepfakes, are very difficult to discern as fakes and are getting ever more credible. Needless to say, there’s nothing consensual about the porn being created.
But we’ve not yet heard anything from Reddit about shutting down the deepfakes subreddit, which features the videos.
The same can’t be said for Discord, a chat service that on Friday shut down a user-created group that was spreading deepfake videos, citing its policy against revenge porn, another term for nonconsensual/involuntary porn.
The service reportedly did so after being contacted by Business Insider.
A company representative from Discord issued a statement:
Non-consensual pornography warrants an instant shut down on the servers whenever we identify it, as well as permanent ban on the users. We have investigated these servers and shut them down immediately.
Business Insider reports that the chat group, called “deepfakes,” included several channels in which users could discuss the technology or share videos that could be used to train the AI. About 150 Discord users were logged into the deepfakes server earlier on Friday, BI reports.
It sounds like the Discord deepfake users were trying to keep it decent. At least, they tried to keep it decent for fellow deepfake participants, with the group’s rules stating that members shouldn’t swear at each other:
It goes without saying that you should respect each other. Be polite, avoid excessive swearing, and speak English at all times.
That politeness doesn’t extend to deepfake subjects, the women dragged unwillingly into the porn videos. The rules did state that videos couldn’t be doctored to include images of “amateurs”; instead, only images of celebrities and public figures were allowed. The rules also barred images or videos of children.
…which apparently puts female celebrities into the category of “do whatever you want,” as if they’re lesser humans whose images can be used without permission or repercussion.
I’ve reached out to Reddit to try to find out whether it has any plans to hold the deepfakes subreddit accountable to its involuntary porn policy. I’ll update the story if I hear back.
Reddit, you did the right thing, eventually, when you took down the subreddit dedicated to sharing ripped-off material during Celebgate. It’s time to do the right thing again with deepfakes, which is just the latest spin on virtually mugging people for their images.
delayedthoughtengineering
Hopefully, these sorts of websites will be ready to destroy any content flagged not just by the individual affected but also by family members, and to remove content when any identified individual in the deepfake (whether a face, a body, a reflection in a mirror, or something else) is known to be deceased. A person’s likeness should not automatically fall into the public domain after death (or worse, simply go unaddressed when used without the permission of the individual’s estate).
I’m not just talking about the faces of popular deceased celebrities (think: Marilyn Monroe deepfakes), but the bodies of pornography performers. For example, a certain porn actress recently committed suicide. Is her work any more subject to being used in deepfakes than that of a still-active actress? Worse, and more likely: should a recently deceased actress’s work be allowed to flood the deepfake arenas simply because of popularity and temporary media coverage?
Typically, porn actors are relatively isolated from ordinary society by the very nature of their work, so when they die, no one remains to say: “This video contains my daughter/mother/sister/aunt/friend. Please take it down.” Even pornography actors should be treated with respect after they are gone.