Wondering whether your boyfriend really, really did delete that photo of you naked, wearing a sports championship medal, as he said he would? (That’s exactly what the Richmond Football Club’s Nathan Broad promised the young woman photographed with the sports memorabilia on her bare chest – and instead of deleting the image, he shared it.)
Facebook wants you to stop worrying about your nudes being shared without your consent like that. It wants you to get to that worry-free state by sending it your nude photos.
WHAAAA????
Stop, breathe. It actually makes sense: Facebook hasn’t given much detail, but from what little has been shared it sounds like it’s planning to use hashes of our nude images, just like law enforcement uses hashes of known child abuse imagery.
A hash is created by feeding a photo into a hashing function. What comes out the other end is a digital fingerprint that looks like a short jumble of letters and numbers. You can’t turn the hash back into the photo but the same photo, or identical copies of it, will always create the same hash.
So, a hash of your most intimate picture is no more revealing than this:
48008908c31b9c8f8ba6bf2a4a283f29c15309b1
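To make that concrete, here’s a minimal Python sketch – nothing Facebook has published, just the general idea – of turning a file’s bytes into a fingerprint like the one above:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return the SHA-1 hex digest of the given bytes."""
    return hashlib.sha1(data).hexdigest()

# Pretend these are the photo's bytes:
photo = b"...the photo's bytes..."

print(fingerprint(photo))                        # a 40-character jumble
print(fingerprint(photo) == fingerprint(photo))  # True: same bytes, same hash
```

The digest reveals nothing about the picture, but any byte-identical copy of the file will always produce the same digest.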
Since 2008, the National Center for Missing & Exploited Children (NCMEC) has made available a list of hash values for known child sexual abuse images, provided by ISPs, that enables companies to check large volumes of files for matches without those companies themselves having to keep copies of offending images or to actually pry open people’s private messages.
The hash originally used to create unique file identifiers was MD5, but Microsoft at one point donated its own PhotoDNA technology to the effort.
PhotoDNA creates a unique signature for an image by converting it to black and white, resizing it, and breaking it into a grid. In each grid cell, the technology finds a histogram of intensity gradients or edges from which it derives its so-called DNA. Images with similar DNA can then be matched.
Given that the amount of data in the DNA is small, large data sets can be scanned quickly, enabling companies including Microsoft, Google, Verizon, Twitter, Facebook and Yahoo to find needles in haystacks and sniff out illegal child abuse imagery. It works even if the images have been resized or cropped.
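PhotoDNA itself is proprietary, but the family of techniques it belongs to – perceptual hashing – can be sketched with a toy example. The “average hash” below is far cruder than PhotoDNA (everything here is illustrative, not Facebook’s code), but it shows the key property: a small edit such as brightening every pixel, which would change every byte of the file and hence any ordinary hash, leaves the perceptual fingerprint untouched:

```python
def average_hash(pixels, grid=8):
    """Toy perceptual hash: average a grayscale image over a grid x grid
    mosaic, then emit one bit per cell (1 = brighter than the overall mean).
    A crude cousin of PhotoDNA, shown only to illustrate matching by
    content rather than by exact bytes."""
    h, w = len(pixels), len(pixels[0])
    bh, bw = h // grid, w // grid
    cells = []
    for gy in range(grid):
        for gx in range(grid):
            block = [pixels[y][x]
                     for y in range(gy * bh, (gy + 1) * bh)
                     for x in range(gx * bw, (gx + 1) * bw)]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return "".join("1" if c > mean else "0" for c in cells)

def hamming(a, b):
    """Number of differing bits: a small distance means similar images."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 64x64 grayscale "image" and a brightened copy of it:
img = [[x + y for x in range(64)] for y in range(64)]
brighter = [[p + 10 for p in row] for row in img]

print(hamming(average_hash(img), average_hash(brighter)))  # 0 - still a match
```

Brightening shifts every cell average and the overall mean by the same amount, so each cell’s brighter-than-average bit is unchanged – which is why this kind of fingerprint survives edits that would scramble an MD5 or SHA-1.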
Mind you, we don’t know if that’s the technology Facebook’s planning to use. It’s announced a pilot program with four countries—the UK, the US, Australia and Canada—in which people will typically be advised to send the photos to themselves via Messenger.
Julie Inman Grant, Australia’s e-safety commissioner, whose office is working with Facebook, told ABC News in Australia that sending photos via Messenger would be enough to enable Facebook to take action to prevent any re-uploads, without the photo being stored or viewed by employees.
Facebook says that it won’t be storing nude pictures but will use photo-matching technology to tag the images after they’re sent via its encrypted Messenger service. Then, Inman Grant said, “if somebody tried to upload that same image, which would have the same digital footprint or hash value, it will be prevented from being uploaded”.
The scheme’s being trialed first in Australia and will soon be tested in Britain, the US and Canada. At present, Facebook users can report photos of themselves that have already been posted nonconsensually or maliciously. Once the images are flagged, Facebook’s in-house teams review them, using hashing to prevent them from being re-uploaded.
Under the pilot scheme, users can act preemptively by notifying safety organizations working with Facebook about specific photos.
True, initially, you do have to hand over the photo in question in order to create the hash. But after that, the hash will be able to help the online platform more or less instantly answer the question “Do I know that photo?”—and to block its reposting—without you having to send the photo again.
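In other words, once the fingerprint is on file, the check at upload time amounts to a simple lookup. A sketch (the blocklist and byte strings here are made up for illustration):

```python
import hashlib

def sha1_hex(data: bytes) -> str:
    return hashlib.sha1(data).hexdigest()

# The platform stores only fingerprints of reported images,
# never the images themselves.
blocked = {sha1_hex(b"reported-photo-bytes")}

def allow_upload(photo: bytes) -> bool:
    """'Do I know that photo?' is a constant-time set lookup."""
    return sha1_hex(photo) not in blocked

print(allow_upload(b"reported-photo-bytes"))  # False: exact copy is blocked
print(allow_upload(b"a-different-photo"))     # True
```

The lookup needs only the hash of the incoming upload, so the original photo never has to change hands again.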
We’d like to see a lot more detail from Facebook on this. For example, what safeguards are in place to ensure that people can’t take any old picture they want—a non-porn publicity photo, for example—and send it in, under the false premise that it’s a nude and that it’s a photo they themselves have the rights to have expunged from social media circulation?
The few details that have been revealed about this program look promising, but Facebook needs to put some flesh on its bones. If it responds to my questions, I’ll let you know.
Updates as of 2017-11-10:
Facebook have since confirmed how the pilot program works in a blog post.
Here’s how it works:
- Australians can file a report on the eSafety Commissioner’s official website.
- The eSafety Commissioner’s office notifies Facebook, but doesn’t have access to the photo.
- To identify the image to Facebook, people send it to themselves on Messenger.
- A member of Facebook’s Community Operations team reviews and hashes the image.
- Facebook stores the hash — not the photo — in its database to prevent future uploads.
- The person deletes the photo from Messenger and Facebook deletes it from its servers.
- Facebook prevents image uploads that match the stored hash from being posted or shared.
What the recent blog post doesn’t specify, but Motherboard has confirmed, is that images sent in for review aren’t blurred out, as a Facebook spokesperson originally explained.
Facebook’s Chief Security Officer Alex Stamos tweeted:
https://twitter.com/alexstamos/status/928741990614827008
Aaron
Wouldn’t it be better to let people generate the hash themselves and submit it to Facebook and others to add to the blacklist, rather than further propagating sensitive data to 3rd parties?
Granted, this could then be abused for censorship of valid photos; but if Facebook don’t manually observe the material being hashed, there are no checks and balances anyway.
Paul Ducklin
It’s not quite as simple as a straight cryptographic hash of a single file. Computing what’s being called a hash in this context isn’t just a matter of processing the image of interest; it might require real-time access to an existing database of images and image hashes, making it impractical to do offline.
My own opinion is that the Australian public service haven’t done us much of a favour by giving such a stripped-down, hand-waving explanation of how this works. (And, as Lisa implies, it sounds as though you are supposed to contact some sort of government agency first. After all, it’s very likely that in many cases of this sort the victim doesn’t actually have an original copy of the offending image to work with: if it wasn’t a selfie, the original file won’t be on the victim’s phone, but on someone else’s. That, one imagines, makes the issue of who “owns” the image and what legalistic protections surround its use much more complicated, and perhaps also makes the legal consequences of abusing it much more serious.)
Bryan
It’s intriguing to me that PhotoDNA works even if the image is resized. My first thought: how few pixels need to be altered to subvert the hash comparison?
RichardD
“… the image of a young *girl* with the sports …”
Not wanting to defend the creep, but the article you’ve linked to only refers to a “young *woman*”. Is there any reason to believe the victim in this case was under-age, as that paragraph seems to suggest?
Lisa Vaas
Thank you so much for pointing out that glitch. I have no idea how the word “girl” got stuck in my head, but as you point out, it’s inaccurate. We’ll edit that post haste. Thanks again.
Mark Stockley
Fixed!
Niko
The ABC news article you cited uses the phrase “young girl”. So that’s where it came from.
Osman GURSOY
After all, why is there public nudity in the first place? Ban it and the problem will be solved once and for all.
FreedomISaMYTH
until the day humans deem hashes as porn…
Mike B
I no longer see numbers…. I just see blonde, brunette, and a redhead :-)
Cetin A.
nice reference from 1999. Where did the years go? :)
Bryan
Facebook needs to put some flesh on its bones
ba dump bump!
Aaron Runkle
The irony of this being posted on nakedsecurity…
Will
PhotoDNA technology (or something similar) is perhaps the most promising option here. I suspect it won’t be long before people start cropping or modifying images ever so slightly to change a simple hash, if that’s what is used; even more likely if some message immediately alerts the user that they’re attempting to upload a prohibited image, sadly.
Mahhn
(not talking about the sexual files)
How about we just get over hang ups about the human body and just get rid of laws about nudity. Everybody has one, yes it’s special, just like everyone else’s. It’s how everyone was made, we all get ugly as we get old. Just imagine all the BS that would vanish in life if nudity meant nothing. Yeah maybe I don’t want to put pants on, so what. I won’t tell you to, and I won’t stare, well, only for the count of 1.
AL
But changing a single bit in a file (picture or otherwise) should result in an entirely different hash value. So unless the ‘ex’ is trying to post the exact same file that was uploaded initially, I don’t see how this works.
Was something left out of the article that addresses this issue? After all, the very definition of a hash is that it generates a fixed-length value regardless of the size of the file. Only one bit has to be changed in order to result in a new hash value.
One other thought regarding the definition of a hash, no two different files should generate the same hash value. When this happens, it is known as a collision and this is generally not a good thing.
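For example, with an ordinary cryptographic hash, flipping a single bit of the input yields a completely unrelated digest (the so-called avalanche effect):

```python
import hashlib

original = b"photo bytes"
# The same data with just the lowest bit of the first byte flipped:
flipped = bytes([original[0] ^ 0x01]) + original[1:]

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(flipped).hexdigest())
# The two digests bear no resemblance to each other.
```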
Paul Ducklin
In an earlier comment, I alluded to the same sort of confusion I felt myself. I think that the Australian government department who bigged this thing up should have provided a bit more detail about what sort of protection this system can provide, and how.
Let’s assume the system can do reliable fuzzy matching of images…
…how do I establish that I really am the one with copyright of or legal control over the image I just messaged to myself? Do I have to upload the image before anyone else does in order to get “first dibs” on it?
If I don’t have the original image because I didn’t take the photo in the first place, then what?
As Lisa said in the article, we’ll have to wait and see what we can find out about how this process actually works before we can judge its likely effectiveness.
Cetin A.
A means of image recognition and filtering like that could also be used to block news footage from social media/web sites if it doesn’t fit to some authorities’ taste.
David Bradbury
So simple: if you don’t want ex-partners (or soon-to-be ex-partners) to post nude photos of you on social media, don’t take, or allow anyone to take, nude photos of yourself. If nude photos are taken without consent on private property and uploaded to the net, sue the person posting them. (UK law; unsure about the rest of the world.)
How can you actually trust a company that won’t allow photos of women breastfeeding their babies, but allows all sorts of violent images, including murders and suicides, claiming it can’t identify them?
Chew Bots
Is this a joke? So all anyone would have to do is crop the image or change it in any way and then it would be unblocked. Kind of pointless if you ask me.
Someguywhoknowshashes
If this system works even when the image has been resized or cropped, then it’s not truly a hashing function, which is intended to give a different hash value if even one bit has been changed.
Paul Ducklin
The term “hash” is used quite loosely and includes fingerprinting techniques as diverse as basic checksums (where there is no cryptographic security and it’s easy to create files with specific hashes), cryptographic hashes like SHA-256 (where it’s as good as impossible to rustle up a file to match a hash), and fuzzy hashing (where the hash can be used to match files against others with certain degrees of similarity).
Justin
There’s absolutely no reason these photos need to be uploaded to generate an MD5, when a piece of standalone desktop software, with no internet connection, could generate these hashes. This is all about Facebook collecting data under the false notion that this provides a sense of security.
John Richard Wilson
Off the bat, I cannot see this working successfully. Sure, you send a picture, they create a hash, and the disgruntled ex can’t post that exact image on Facebook. So far the concept sounds good. But all someone would need to do is open the image in MS Paint and add a small, insignificant scribble, and the image’s hash will be different. Then they post that one; since it generates a different hash, Facebook’s system won’t see it as something to block, and the disgruntled ex humiliates their victim anyway. Unless there are other mechanisms in place that Facebook haven’t revealed to prevent such a flaw.
Mark Stockley
That’s where technology like PhotoDNA and fuzzy hashes come in.
Max
So, reading the update, sounds like Facebook will soon pay people full time to look at nudes from their users?
Paul Ducklin
It sounded to me as though the Facebook Community Operations Team won’t get involved unless and until you have created an official report with the relevant Aussie government department. So describing this Community Operations Team as “people who get paid to look at users’ nude photos” is a bit unfair…
Max
As I understood it, the Facebook employees will be looking at the nudes to make sure it’s not a fraudulent report, i.e. a picture of someone that’s not you. So, while you might prevent your picture from being seen by everyone on the internet, at least one person you didn’t want to see it will have to see it. Also, if Facebook has to make sure it’s a valid report, what exactly is the point of the eSafety Commissioner?
Paul Ducklin
Don’t know. Part of me wants to congratulate the public servants in this case for trying; but another part of me thinks this might be a somewhat complicated hammer looking for a nail. (That’s my personal opinion, of course :-)
Anonymous
“Hammer looking for a nail” .. lolz
TW
Facebook: send nudes pls… I promise to delete…
What could possibly go wrong???