Virginia has expanded its ban on nonconsensual pornography to include deepfakes, or what it’s calling the distribution of “falsely created” images and videos.
The updated law, which went into effect on Monday, builds on an existing statute that makes it a Class 1 misdemeanor to share or sell nude or sexual images and videos in order to “coerce, harass, or intimidate.” Violators face up to 12 months in jail and a fine of up to $2,500.
Here’s the new rule, with the added language about deepfakes in bold:
Any person who, with the intent to coerce, harass, or intimidate, maliciously disseminates or sells any videographic or still image created by any means whatsoever, **including a falsely created videographic or still image**, that depicts another person who is totally nude, or in a state of undress so as to expose the genitals, pubic area, buttocks, or female breast, where such person knows or has reason to know that he is not licensed or authorized to disseminate or sell such videographic or still image is guilty of a Class 1 misdemeanor.
It’s not just Virginia. Deepfakes are getting plenty of people plenty worried. The technology started out as a way to do artificial intelligence (AI)-generated face-swaps on porn stars, politicians and other celebrities, but it’s since evolved. The latest iteration surfaced last week in the form of DeepNude: a $50 app that, with a single click, turned a photo of any woman into a fake nude, swapping out her clothes for breasts and a vulva.
After Motherboard reported on DeepNude, the internet threw up at the thought, prompting the app’s anonymous creator, who goes by the name Alberto, to announce that he was shutting it down:
Despite the safety measures adopted (watermarks) if 500,000 use it the probability that people will misuse it will be too high. […] The world is not yet ready for DeepNude.
Oh, Alberto, the world is sooooo ready for DeepNude. It’s so ready, it’s spread on file-sharing networks across the land regardless of your purportedly shutting it down. That horse didn’t just escape the barn before you shut the door. It tucked a copy of your compiled code into its saddle bag, hopped on a transatlantic flight and decided to backpack across Siberia for summer break.
As The Register reported on Tuesday, the DeepNude app, sold in Windows and Linux versions, was downloaded by “hordes of internet pervs,” who are now sharing the packages on file-sharing networks.
They’re not just sharing it: they’re reverse-engineering it. They’ve reportedly improved on the creator’s buggy first release, and have stripped out the automatic watermarks that labeled generated images as “fake.” Not that those watermarks couldn’t be rinsed off with Photoshop in a nanosecond, mind you.
At any rate, although Virginia may be the first to get a deepfakes ban enacted, it’s looking like it won’t be the last.
Here’s a rundown of other laws and pending bills that would regulate deepfakes:
- There’s a bipartisan effort going on in Congress: Sen. Ben Sasse (R-NE) and Rep. Yvette Clarke (D-NY) have both introduced bills that would regulate deepfakes.
- Texas passed its own anti-deepfakes law, which goes into effect on 1 September 2019. However, it addresses election manipulation, not nonconsensual porn.
- New York is considering a bill that would ban creating “digital replicas” of people without their consent – a bill that the Motion Picture Association of America is leery about, given that it would “restrict the ability of our members to tell stories about and inspired by real people and events.”