
Facebook’s counterintuitive way to combat nonconsensual porn

"Upload your nudes to stop revenge porn" might sound crazy but it actually makes sense.

In November 2017, Facebook came up with a way to help us keep our nude photos from being shared without our consent.
It sounded crazy: in order to save us from revenge/nonconsensual porn, Facebook wanted us to send in our nude photos.
But it actually made sense: Facebook would create hashes of our nude images, just like law enforcement uses hashes of known child abuse imagery.
Facebook promised that it wouldn’t store the nudes itself but would instead use photo-matching technology to tag the images after they’re sent in. Then, if somebody tried to upload that same image – which would have the same digital fingerprint, or hash value – it would be stopped in its tracks before it could be posted.
People can already report intimate images that have been shared without their consent. Such images are removed, and Facebook creates the hash so its systems can automatically recognize an image and block it if somebody tries to post the image again.
But Facebook says it can do more to keep nudes from being shared on its services in the first place.
On Tuesday, Facebook Global Head of Safety Antigone Davis announced that this week, Facebook’s testing a proactive reporting tool, in partnership with an international working group of safety organizations, survivors, and victim advocates, including the Australian Office of the eSafety Commissioner, the Cyber Civil Rights Initiative and The National Network to End Domestic Violence (NNEDV) in the US, the UK Revenge Porn Helpline, and YWCA Canada.
The pilot program first launched in Australia. Now, it’s also going to be tested in the UK, the US and Canada.
Facebook’s work on the project has included Davis and her team traveling to nine countries across four continents, from Kenya to Sweden, listening to stories about the abuse and cruelty that women face online. While people of all genders, ages and sexual orientations are targeted, Davis notes that women are nearly twice as likely to be targeted with nonconsensual/revenge porn as men.
From her public post:

From anxiety and depression to the loss of a personal relationship or a job, this violation of privacy can be devastating.

The photo hashing project is a work in progress, but here’s how it works now:

  1. Contact one of Facebook’s partners to submit a form: see the links above.
  2. You’ll then receive an email containing what Facebook says is a secure, one-time upload link.
  3. Use the link to upload images you fear will be shared.
  4. A “specifically trained” human – just one, Facebook said – from its Community Operations Safety Team will review the image to confirm that it violates Facebook’s policy on nudity and sexual activity. If it does, they’ll create a hash that allows the platform to identify future uploads of the image(s) without keeping copies of them on its servers.
  5. Facebook will then notify the victim via email and delete the images from its servers no later than a week afterward.
  6. Facebook will store the hashes so that any time someone tries to upload an image with a matching hash, it will be blocked on its services – that includes Facebook, Instagram and Messenger.

True, initially, you do have to hand over the photo in question in order to create the hash. But after that, the hash will be able to help the online platform more or less instantly answer the question “Do I know that photo?” – and to block its reposting – without you having to send the photo again.
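To make that lookup concrete, here’s a minimal sketch in Python of how a hash blocklist can answer that question at upload time. It’s an illustration under our own assumptions, not Facebook’s actual pipeline – which isn’t public, and which matches images perceptually rather than byte-for-byte as the SHA-1 used here does.

```python
# A minimal sketch of a hash blocklist, for illustration only.
# Facebook's real system is not public; it uses perceptual matching,
# whereas SHA-1 only matches byte-identical files.
import hashlib

# Fingerprints of reported images; the images themselves are never stored.
blocked_hashes = {
    "48008908c31b9c8f8ba6bf2a4a283f29c15309b1",
}

def is_blocked(upload_bytes: bytes) -> bool:
    """Answer 'do I know that photo?' by comparing fingerprints only."""
    return hashlib.sha1(upload_bytes).hexdigest() in blocked_hashes
```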
The initial Australian pilot raised questions that Facebook has since tried to tackle. For example, what about false reporting? What safeguards are in place to ensure that people can’t take any old picture they want – a non-porn publicity photo, for example – and send it in, under the false premise that it’s a nude and that it’s a photo they themselves have the rights to have expunged from social media circulation?
As Facebook’s Chief Security Officer Alex Stamos tweeted in November, that’s why we have to trust the humans whose eyes will be reviewing the photos… and why those photos won’t be blurred:
https://twitter.com/alexstamos/status/928741990614827008
Do you trust Facebook with content as sensitive as this? Its record on privacy isn’t good, but its record on security is.
I’m inclined to think that this is a good step, at any rate. Hashing is an important tool in the battle to keep child abuse imagery offline, and it makes sense to apply it in the battle against revenge porn.

A primer on image hashing

This is how it works: A hash is created by feeding a photo into a hashing function. What comes out the other end is a digital fingerprint that looks like a short jumble of letters and numbers. You can’t turn the hash back into the photo but the same photo, or identical copies of it, will always create the same hash.
So, a hash of your most intimate picture is no more revealing than this:
48008908c31b9c8f8ba6bf2a4a283f29c15309b1
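For the curious, that 40-character digest has the shape of a SHA-1 hash, and computing one takes a few lines of Python using the standard hashlib module (the file name below is hypothetical):

```python
# Hashing an image file into a short digital fingerprint. The same
# bytes always yield the same digest, and the digest can't be turned
# back into the photo. "photo.jpg" is a hypothetical file name.
import hashlib

def fingerprint(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha1(f.read()).hexdigest()

print(fingerprint("photo.jpg"))   # 40 hex characters, like the example above
```

One caveat: an exact hash like this only matches byte-identical copies – re-encode, resize or crop the photo and the digest changes completely. That’s why photo-matching systems rely on perceptual hashes such as PhotoDNA, described below.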
Since 2008, the National Center for Missing & Exploited Children (NCMEC) has made available to ISPs a list of hash values for known child sexual abuse images, enabling companies to check large volumes of files for matches without themselves having to keep copies of the offending images.
Microsoft at one point donated its PhotoDNA technology to the effort. Facebook’s likely using its own sophisticated image recognition technology for the nude-images project, but it’s instructive to look at how PhotoDNA works.
PhotoDNA creates a unique signature for an image by converting it to black and white, resizing it, and breaking it into a grid. In each grid cell, the technology finds a histogram of intensity gradients or edges from which it derives its so-called DNA. Images with similar DNA can then be matched.
Given that the amount of data in the DNA is small, large data sets can be scanned quickly, enabling companies including Microsoft, Google, Verizon, Twitter, Facebook and Yahoo to find needles in haystacks and sniff out illegal child abuse imagery. It works even if the images have been resized or cropped.
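To illustrate the grid-and-gradients idea – and only that: PhotoDNA itself is proprietary, and everything below, from the grid size to the function names, is our own simplified assumption – here’s a toy perceptual signature in Python, using Pillow and NumPy:

```python
# A much-simplified sketch of a PhotoDNA-style perceptual signature,
# for illustration only; the real algorithm is proprietary and far
# more robust. All names and parameters here are hypothetical.
from PIL import Image
import numpy as np

GRID = 6    # 6x6 grid of cells
SIZE = 96   # normalize every image to 96x96 pixels

def perceptual_signature(path: str) -> np.ndarray:
    """Grayscale -> resize -> grid -> per-cell edge-intensity summary."""
    img = Image.open(path).convert("L").resize((SIZE, SIZE))
    pixels = np.asarray(img, dtype=float)
    # Intensity gradients (edges) along the y and x axes
    gy, gx = np.gradient(pixels)
    edges = np.hypot(gx, gy)
    cell = SIZE // GRID
    sig = np.array([
        edges[r*cell:(r+1)*cell, c*cell:(c+1)*cell].mean()
        for r in range(GRID) for c in range(GRID)
    ])
    # Normalize so global brightness/contrast changes matter less
    return sig / (np.linalg.norm(sig) or 1.0)

def similar(path_a: str, path_b: str, threshold: float = 0.1) -> bool:
    """Two images 'match' if their signatures are close, not identical."""
    diff = perceptual_signature(path_a) - perceptual_signature(path_b)
    return np.linalg.norm(diff) < threshold
```

Because a resized or lightly cropped image produces a signature close to the original’s, matching is done by distance rather than exact equality, which is what lets this family of techniques survive the alterations mentioned above.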
Davis says that the photo hashing is just one step “to help people who fear an intimate image will be shared without their consent.”

We look forward to learning from this pilot and further improving our tools for people in devastating situations like these.


7 Comments

What happens with altered images? Your face on a porn star’s body? The real you with explicit tattoos added in strategic places?
And if it’s a vengeful ex-partner, you may not have your own copy to submit.


I too was thinking about the altered-images thing… It looks like they’re doing their best if they’ve got a model similar to PhotoDNA. At any rate, if they get a hash that matches, they can block it. That’s a good step forward.


Minor alterations don’t evade PhotoDNA, so we can safely assume Facebook’s implementation will also catch them fairly reliably. Deepfakes, and images you don’t have access to submit in the first place, will evade it (deepfakes may not – Facebook have some impressive facial recognition software to help here), but that doesn’t mean it’s not worthwhile, and Facebook have acknowledged that this isn’t a silver bullet, just one weapon in the arsenal against awful things like revenge porn.
I normally despise Facebook, but this I’m entirely in favour of.


Hopefully people will understand that Facebook will only provide an upload link via email for such sensitive photos if they initiated the request in the first place… I guess they’re so concerned about false reporting that they don’t want to let people generate their own hashes. However, it seems like they could review images when a hash matches and err on the safe side – e.g. if a submitted hash matches a nude image already public on Facebook, delete it, since it’s already violating their terms.


Why not allow the person who is concerned about images being uploaded to create the hash and submit just the hash? When someone uploads an image with a matching hash, the image is reviewed: if it breaches the standards, the image is removed and the hash is marked as valid, meaning further images with the same hash cannot be uploaded; if it does not, the hash is marked as invalid and stops blocking anything. That way, images are only looked at when someone else tries to upload them, and nobody has to review all the nude images up front.


Color me old-fashioned… but why on earth would anybody put nude photos onto any kind of public area?! If you don’t do that in the first place, you don’t have to gain ulcers worrying that they’ll be hacked!! It is not enlightened or feminist or any other label to put nude photos anywhere – it’s just plain dumb!

