
Facebook acts like a law-breaking ‘digital gangster’, says official report

Facebook considers itself to be “ahead of and beyond the law,” UK lawmakers said in a report about “disinformation and ‘fake news’”.

On Sunday, following an investigation of more than a year, the UK Parliament accused Facebook of thumbing its nose at the law, having “intentionally and knowingly violated both data privacy and anti-competition laws”.

Lawmakers called for the Information Commissioner’s Office (ICO) to investigate the social media platform’s practices, including how it uses the data of both users and users’ friends, as well as its use of “reciprocity” in data sharing.

Their report, which centered on disinformation and fake news, was published by a House of Commons committee – the Digital, Culture, Media and Sport Committee – that oversees media policy. From that report:

Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.

The investigation focused on Facebook’s business practices before and after the Cambridge Analytica scandal.

Facebook shouldn’t be allowed to wriggle out from under culpability for the content users push across its platforms, the report said, alluding to how the platform was used by foreign actors to interfere with the 2016 US presidential election and the Brexit campaign:

Facebook’s handling of personal data, and its use for political campaigns, are prime and legitimate areas for inspection by regulators, and it should not be able to evade all editorial responsibility for the content shared by its users across its platforms.

Facebook: Bring it!

According to the BBC, Facebook welcomed the committee’s report and said it would be open to “meaningful regulation”.

Facebook also patted itself on the back for having participated in the investigation:

[Facebook shares] the committee’s concerns about false news and election integrity and are pleased to have made a significant contribution to their investigation over the past 18 months, answering more than 700 questions and with four of our most senior executives giving evidence.

The committee begs to differ on those claims. Committee Chair and MP Damian Collins said that Facebook did not fully cooperate with the investigation. The BBC quoted him:

We believe that in its evidence to the committee, Facebook has often deliberately sought to frustrate our work, by giving incomplete, disingenuous and at times misleading answers to our questions.

That’s nothing new: the committee has found Facebook to be less than cooperative for some time. Over the course of the UK investigation, Facebook CEO Mark Zuckerberg steadfastly refused to appear before MPs to explain the company’s moves with regard to fake news.

That’s actually what led the committee to try to get the information it sought from another source: namely, a lawsuit brought against Facebook by Six4Three, a tiny developer whose app searched out Facebook friends’ bikini photos. Six4Three has been wrangling with Facebook in US court since 2015 over allegations that the platform turned off the Friends data API spigot as a way of forcing developers to buy advertising, transfer intellectual property, or even sell themselves to Facebook at bargain-basement prices.

In December, Parliament’s fake news inquiry got its fingers into the legal Six4Three pie and came out with a fistful of Facebook staff’s private emails, which it then published.

Six4Three has alleged that the correspondence shows that Facebook was not only aware of the implications of its privacy policy, but actively exploited them. Collins and his committee were particularly interested in the app company’s assertions that Facebook intentionally created and effectively flagged up the loophole that Cambridge Analytica used to collect user data.

According to the report released on Sunday, the Six4Three court documents indicate that Facebook was…

…willing to override its users’ privacy settings in order to transfer data to some app developers, to charge high prices in advertising to some developers, for the exchange of that data, and to starve some developers – such as Six4Three – of that data, thereby causing them to lose their business.

At the very least, that would mean that Facebook violated its 2011 settlement with the Federal Trade Commission (FTC), the report said. The FTC had found that Facebook failed to protect users’ data and had let app developers gain as much access to it as they liked, without restraint – an outcome of how Facebook had built the company so as to make data abuses easy.

The report noted that the ICO had told the committee that Facebook needs to “significantly change its business model and its practices to maintain trust.” Yet the Six4Three documents show that Facebook “intentionally and knowingly violated both data privacy and anti-competition laws,” the report said, and thus the committee is calling on the ICO to carry out a detailed investigation into Facebook’s practices around the use of users’ data and users’ friends’ data.

“Democracy is at risk”

Cambridge Analytica is a case study in how politics has intersected with access to Facebook’s fat data APIs. It was a data analytics company started by a group of researchers with connections to Cambridge University in the UK.

The firm collected user data without permission in order to build a system that could profile individual US voters so as to target them with personalized political ads. Its researchers got at the data via a Facebook personality test called thisisyourdigitallife that billed itself as “a research app used by psychologists”.

Besides the effects of fake news, at the heart of the UK inquiry was the question of how much of that data was used for political campaigning. The report’s conclusion:

Democracy is at risk from the malicious and relentless targeting of citizens with disinformation and personalised ‘dark adverts’ from unidentifiable sources, delivered through the major social media platforms we use every day.

The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data privacy rights.

The report is calling for:

  • A compulsory code of ethics for tech companies, overseen by an independent regulator, to set out what constitutes harmful content.
  • The regulator to be given powers to launch legal action if companies breach the code, including, potentially, large fines.
  • The government to reform current electoral laws and rules on overseas involvement in UK elections.
  • Social media companies to be forced to take down known sources of harmful content, including proven sources of disinformation.
  • Tech companies operating in the UK to be taxed to help fund the work of the ICO and any new regulator set up to oversee them.

Facebook denies having broken any laws in the country. Karim Palant, public policy manager for Facebook in the United Kingdom:

While we still have more to do, we are not the same company we were a year ago.