
Facebook’s counterintuitive way to combat nonconsensual porn


In November 2017, Facebook came up with a way to help us keep our nude photos from being shared without our consent.
It sounded crazy: in order to save us from revenge/nonconsensual porn, Facebook wanted us to send in our nude photos.
But it actually made sense: Facebook would create hashes of our nude images, just like law enforcement uses hashes of known child abuse imagery.
Facebook promised that it wouldn’t store the nudes itself but would instead use photo-matching technology to tag the images after they’re sent in. Then, if somebody tried to upload that same image, which would have the same digital footprint or hash value, it would be stopped dead in its tracks before being uploaded.
People can already report intimate images that have been shared without their consent. Such images are removed, and Facebook creates the hash so its systems can automatically recognize an image and block it if somebody tries to post the image again.
But Facebook says it can do more to keep nudes from being shared on its services in the first place.
On Tuesday, Facebook Global Head of Safety Antigone Davis announced that this week, Facebook’s testing a proactive reporting tool, in partnership with an international working group of safety organizations, survivors, and victim advocates, including the Australian Office of the eSafety Commissioner, the Cyber Civil Rights Initiative and The National Network to End Domestic Violence (NNEDV) in the US, the UK Revenge Porn Helpline, and YWCA Canada.
The pilot program first launched in Australia. Now, it’s also going to be tested in the UK, the US and Canada.
Facebook’s work on the project has included Davis and her team traveling to nine countries across four continents, from Kenya to Sweden, listening to stories about the abuse and cruelty that women face online. While people of all genders, ages and sexual orientations are targeted, Davis notes that women are nearly twice as likely to be targeted with nonconsensual/revenge porn as men.
From her public post:

From anxiety and depression to the loss of a personal relationship or a job, this violation of privacy can be devastating.

The photo hashing project is a work in progress, but here’s how it works now:

  1. Contact one of Facebook’s partners to submit a form: see the organizations listed above.
  2. You’ll then receive an email containing what Facebook says is a secure, one-time upload link.
  3. Use the link to upload images you fear will be shared.
  4. A “specifically trained” human – just one, Facebook said – from its Community Operations Safety Team will review the image to confirm that it does indeed violate Facebook’s policies on nudity and sexuality. If it does, they’ll create a hash that will allow the platform to identify future uploads of the image(s) without keeping copies of them on its servers.
  5. Facebook will then notify victims via email and will delete the images from its servers “no later” than a week afterwards.
  6. Facebook will store the hashes so that any time someone tries to upload an image with the same hash, it will be blocked across its services – Facebook, Instagram and Messenger.

True, initially, you do have to hand over the photo in question in order to create the hash. But after that, the hash will be able to help the online platform more or less instantly answer the question “Do I know that photo?” – and to block its reposting – without you having to send the photo again.
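In code terms, the “Do I know that photo?” check is just a fingerprint lookup. Here’s a minimal, illustrative Python sketch of the store-fingerprints-not-photos idea; it uses a plain SHA-256 digest for simplicity, whereas Facebook’s real system relies on its own photo-matching technology and keeps no copies of the images:

```python
import hashlib

# Toy blocklist: store fingerprints of reported images, never the images.
blocked_hashes = set()

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for whatever fingerprint the real system computes.
    return hashlib.sha256(image_bytes).hexdigest()

def register_reported_image(image_bytes: bytes) -> None:
    # Hash the reported image once, then discard the image itself.
    blocked_hashes.add(fingerprint(image_bytes))

def is_blocked(upload_bytes: bytes) -> bool:
    # "Do I know that photo?" - answered without storing the upload.
    return fingerprint(upload_bytes) in blocked_hashes
```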
The initial Australian pilot raised questions that Facebook has since tried to tackle. For example, what about false reporting? What safeguards are in place to ensure that people can’t take any old picture they want – a non-porn publicity photo, for example – and send it in, under the false premise that it’s a nude and that it’s a photo they themselves have the rights to have expunged from social media circulation?
As Facebook’s Chief Security Officer Alex Stamos tweeted in November, that’s why we have to trust the humans whose eyes will be reviewing the photos… and why those photos won’t be blurred.

Do you trust Facebook with content as sensitive as this? Its record on privacy isn’t good, but its record on security is.
I’m inclined to think that this is a good step, at any rate. Hashing is an important tool in the battle to keep child abuse imagery offline, and it makes sense to apply it in the battle against revenge porn.

A primer on image hashing

This is how it works: A hash is created by feeding a photo into a hashing function. What comes out the other end is a digital fingerprint that looks like a short jumble of letters and numbers. You can’t turn the hash back into the photo, but the same photo, or identical copies of it, will always create the same hash.
So, a hash of your most intimate picture is no more revealing than this:
48008908c31b9c8f8ba6bf2a4a283f29c15309b1
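That example string has the shape of a SHA-1 digest, and producing one takes only a couple of lines of Python. The sketch below, with a placeholder filename, shows how such a fingerprint is derived from a file; note that a cryptographic hash like this only matches byte-identical copies, which is why photo-matching systems use perceptual techniques like the one described further down:

```python
import hashlib

# "my_photo.jpg" is a placeholder filename for illustration.
with open("my_photo.jpg", "rb") as f:
    digest = hashlib.sha1(f.read()).hexdigest()

print(digest)  # e.g. 48008908c31b9c8f8ba6bf2a4a283f29c15309b1
# The digest can't be reversed into the photo, but a byte-identical
# copy of the file will always produce the same digest.
```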
Since 2008, the National Center for Missing & Exploited Children (NCMEC) has made available a list of hash values for known child sexual abuse images, provided by ISPs, that enables companies to check large volumes of files for matches without those companies themselves having to keep copies of offending images.
Microsoft at one point donated its PhotoDNA technology to the effort. Facebook’s likely using its own sophisticated image recognition technology for the nude-images project, but it’s instructive to look at how PhotoDNA works.
PhotoDNA creates a unique signature for an image by converting it to black and white, resizing it, and breaking it into a grid. In each grid cell, the technology finds a histogram of intensity gradients or edges from which it derives its so-called DNA. Images with similar DNA can then be matched.
Given that the amount of data in the DNA is small, large data sets can be scanned quickly, enabling companies including Microsoft, Google, Verizon, Twitter, Facebook and Yahoo to find needles in haystacks and sniff out illegal child abuse imagery. It works even if the images have been resized or cropped.
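To make that concrete, here’s a rough, simplified imitation of that recipe in Python (using Pillow and NumPy). It is not PhotoDNA itself, just a sketch of the grayscale–resize–grid–gradient idea, with fingerprints compared by distance rather than exact equality so that resized copies can still match; the filenames and the 0.1 threshold are arbitrary choices for this illustration:

```python
import numpy as np
from PIL import Image

def simple_photo_fingerprint(path: str, size: int = 96, grid: int = 6) -> np.ndarray:
    # Loose imitation of the PhotoDNA recipe: black and white, fixed size,
    # grid of cells, edge-strength summary per cell.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = np.asarray(img, dtype=np.float32)

    gy, gx = np.gradient(pixels)   # simple intensity gradients
    edges = np.hypot(gx, gy)       # edge strength at each pixel

    cell = size // grid
    features = [
        edges[r:r + cell, c:c + cell].mean()
        for r in range(0, size, cell)
        for c in range(0, size, cell)
    ]
    vec = np.array(features, dtype=np.float32)
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

def looks_like_match(fp_a: np.ndarray, fp_b: np.ndarray, threshold: float = 0.1) -> bool:
    # Near-duplicates land close together in fingerprint space;
    # exact equality isn't required.
    return float(np.linalg.norm(fp_a - fp_b)) < threshold

# Usage (hypothetical filenames): a resized or lightly cropped copy
# should still come out as a match.
# original = simple_photo_fingerprint("original.jpg")
# copy = simple_photo_fingerprint("resized_copy.jpg")
# print(looks_like_match(original, copy))
```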
Davis says that the photo hashing is just one step “to help people who fear an intimate image will be shared without their consent.”

We look forward to learning from this pilot and further improving our tools for people in devastating situations like these.

