Naked Security

Facebook staff’s private emails published by fake news inquiry

The cache of seized Facebook documents shows how Facebook whitelists certain companies so they can keep lapping up user data.

Want to know what Mark Zuckerberg and his underlings really think about us users?
Get ready to read ’em and weep: against the wishes of the Facebook CEO, the UK parliament’s inquiry into fake news has published confidential correspondence between Zuck and his staff.
That correspondence has some revealing stuff in it. But first, how did Parliament’s Digital, Culture, Media and Sport (DCMS) committee – which has been overseeing inquiries into Facebook’s privacy practices – get its hands on it?
Well, it has to do with bathing suit photos. A now-defunct app called Six4Three that searched for Facebook users’ bathing suit photos is embroiled in a years-long lawsuit against Facebook.
Six4Three alleges that Facebook suddenly changed the terms of how it allowed developers to access Facebook’s Graph API generally, and its Friends’ Photos Endpoint, specifically. Six4Three made an app known as “Pikinis” that specifically sought out bikini photos across Facebook users’ friends pages. In April 2015, Six4Three sued Facebook, claiming that Facebook’s sudden yanking of access rendered both the app and the company itself “worthless.”
According to a court filing from last week, Six4Three managing director Ted Kramer met with MP Damian Collins in his London office on 20 November. Collins told Kramer that he was under active investigation, that he was in contempt of parliament, and that he could potentially face fines and imprisonment.
Kramer is then said to have “panicked” and whipped out a USB drive before frantically searching his Dropbox account for relevant files obtained under civil discovery. He looked for any files whose names suggested they might be relevant, dragged them onto the USB drive without even opening them, and handed over the USB stick – in spite of Facebook having labelled the documents highly confidential, and “against the explicit statements by counsel in the above referenced communications,” according to last week’s filing.
That’s it in a nutshell. Check out write-ups from Ars Technica and from The Observer, which broke the news, for more details about the case and the incident: it’s a hell of a sticky legal wicket when it comes to limits of British authorities’ legal reach with international companies such as Facebook.
As it is, Facebook has steadfastly refused to appear before MPs to explain the company’s moves with regards to fake news. MP Collins, head of the committee, says that the Six4Three case in the US suggested another option of getting the information the committee sought. The Observer quoted him:

We have followed this court case in America and we believed these documents contained answers to some of the questions we have been seeking about the use of data, especially by external developers.

When it comes to the Cambridge Analytica user data fiasco, Six4Three alleges that the correspondence shows that Facebook was not only aware of the implications of its privacy policy, but actively exploited them. Collins and his committee were particularly interested in the app company’s assertions that Facebook intentionally created and effectively flagged up the loophole that Cambridge Analytica used to collect user data.

On Wednesday, the parliamentary committee published about 250 pages of the correspondence, some of which are marked “highly confidential”.
These are the key issues found in the correspondence that MP Collins highlighted in his introductory note:

  • In 2014/2015, Facebook limited the data on users’ friends that developers could see. Regardless, it kept a whitelist of certain companies that it allowed to maintain full access to friend data. Collins said that it’s “Not clear that there was any user consent for this, nor how Facebook decided which companies should be whitelisted.”
  • Collins says that Facebook knew that changing its policies on the Android mobile phone system to enable the Facebook app to collect a record of users’ calls and texts would be controversial …so the plan was to bury it deep. “To mitigate any bad PR, Facebook planned to make it as hard as possible for users to know that this was one of the underlying features of the upgrade of their app,” Collins said.
  • You might recall that up until recently Facebook had been pushing people to download a virtual private network (VPN) app, Onavo, that it acquired in 2013 for “protection” …without mentioning that it was phoning home to Facebook to deliver users’ app usage habits, even when the VPN was turned off. In August, Apple suggested that Facebook remove Onavo from the App Store due to privacy violations. Collins wrote that, apparently without users’ knowledge, Facebook had been using Onavo to conduct global surveys of what mobile apps its customers were using. Then, it used that data to figure out not just how many people had downloaded apps, but how often they used them: useful knowledge when it came to deciding “which companies to acquire, and which to treat as a threat,” Collins wrote.
  • The files contain evidence that when Facebook took aggressive positions against apps and turned off their access to data, it sometimes led to businesses failing.
  • Twelve of the Six4Three documents include discussions on businesses that got whitelisted when it came to access to users’ friend data. The whitelisted firms include the dating service Badoo, its spin-off Hot or Not, and the dating app Bumble, which Badoo had invested in; Lyft; Netflix; and Airbnb. Facebook didn’t whitelist just any old company, though: it denied the friends data firehose API to companies including Ticketmaster, Vine, and Airbiquity, a connected-cars company.

Below is one of many email extracts published on Wednesday that show how Facebook has targeted competitor apps. It’s about shutting down access to users’ friend data to Vine, which was Twitter’s short-video service:

Facebook email 24 January 2013
Justin Osofsky (Facebook vice president):
‘Twitter launched Vine today which lets you shoot multiple short video segments to make one single, 6-second video. As part of their NUX, you can find friends via FB. Unless anyone raises objections, we will shut down their friends API access today. We’ve prepared reactive PR, and I will let Jana know our decision.
Mark Zuckerberg:
‘Yup, go for it.’

And here’s an excerpt from a discussion dated 4 February 2015 about giving Facebook’s Android app permission to read users’ call logs in such a way that they wouldn’t see a permissions dialog:

Michael LeBeau (Facebook product manager):
‘Hey guys, as you know all the growth team is planning on shipping a permissions update on Android at the end of this month. They are going to include the ‘read call log’ permission, which will trigger the Android permissions dialog on update, requiring users to accept the update. They will then provide an in-app opt in NUX for a feature that lets you continuously upload your SMS and call log history to Facebook to be used for improving things like PYMK, coefficient calculation, feed ranking etc. This is a pretty high-risk thing to do from a PR perspective but it appears that the growth team will charge ahead and do it.’
Yul Kwon (Facebook product manager):
‘The Growth team is now exploring a path where we only request Read Call Log permission, and hold off on requesting any other permissions for now.
‘Based on their initial testing, it seems this would allow us to upgrade users without subjecting them to an Android permissions dialog at all.
‘It would still be a breaking change, so users would have to click to upgrade, but no permissions dialog screen.’

Facebook told the BBC that the documents had been presented in a “very misleading manner” and required more context. It quoted a Facebook spokeswoman:

We stand by the platform changes we made in 2015 to stop a person from sharing their friends’ data with developers.
Like any business, we had many internal conversations about the various ways we could build a sustainable business model for our platform.
But the facts are clear: we’ve never sold people’s data.

Zuckerberg also posted a response on his Facebook page. In it, he put context around the company’s decisions, including its efforts to fight “sketchy apps” such as the quiz that led to the Cambridge Analytica situation.

I understand there is a lot of scrutiny on how we run our systems.
That’s healthy, given the vast number of people who use our services around the world, and it is right that we are constantly asked to explain what we do. But it’s also important that the coverage of what we do – including the explanation of these internal documents – doesn’t misrepresent our actions or motives. This was an important change to protect our community, and it achieved its goal.


Zuckerberg has only himself to blame. Ignoring requests by the UK Parliament to appear in person and explain what is going on – when the Cambridge Analytica saga has shown there is a high probability that the very contentious June 2016 EU referendum may have been unduly influenced – is not the sign of a good leader in control.
Many of us are probably ready for “Safebook” where we pay a modest sum for use of a service that doesn’t scrape and sell our data to the highest, most dubious bidder.


I’m interested to know how the Cambridge Analytica saga has “shown” that Mr Zuckerberg had any measurable effect on the June 2016 referendum at all.
Like a US presidential election, a referendum is a straight-out two-horse race. If the results are widely expected to be fairly close anyway (and I think everyone has to agree that no one was expecting a landslide either way), then you simply have to expect that things could go either way.
Additionally, referendums are very rare in the UK, so many people voting in the last one will only ever have experienced regular election ballots before.
If you are determined to find a reason why the British people “inadvertently” voted the wrong way, other than that more people held different views than you like to think, I’d start by pondering how many people voted *as they might in an election*, making their choice on the basis that they’d get a legally mandated chance to change their minds within at most the next five years, which is the maximum time that can elapse between UK parliamentary elections except under extraordinary circumstances. (WW2 was an example of such extraordinariness.)
In short, if you want a reason for why the referendum ended as it did, you have dozens of much more likely ones than a “high probability” that undue influence was exerted by a pseudoscientific psychometric app, if that is not a tautology, published on a global social media platform.
Not batting for Zuck or FB here, or even suggesting that the Poms made a wise choice in thinking it would be smart to (try to) leave the EU, just sticking up for scientific method and critical reasoning.


With respect, this isn’t about the result of the referendum, it’s about the validity of the vote based on any external factors. Any state wide vote should be investigated if there is a suggestion of wrongdoing.
Pushing this aside for a second, Zuckerberg’s refusal to answer questions that form part of an investigation is why we are here. The British legal system’s powers are far-reaching and are being used to (rightly) investigate the referendum.

