Up until September 2017, when Facebook handed Congress details of thousands of propaganda ads placed by a Russian troll farm to sow political strife, CEO Mark Zuckerberg dismissed the idea that fake news on Facebook influenced last year’s election. It’s “a pretty crazy idea,” he scoffed.
Yeah, he would say that, according to Sandy Parakilas, a former Facebook operations manager who was tasked with fixing the platform's privacy problems: because Facebook doesn’t give a flying Farmville fig about protecting users from abuse.
What it cares about is data collection it can sell to advertisers, he said in a scathing opinion piece published by the New York Times on Monday.
What I saw from the inside was a company that prioritized data collection from its users over protecting them from abuse. As the world contemplates what to do about Facebook in the wake of its role in Russia’s election meddling, it must consider this history. Lawmakers shouldn’t allow Facebook to regulate itself. Because it won’t.
The editorial was titled “We Can’t Trust Facebook to Regulate Itself,” and that about sums up Parakilas’ take on the matter: an opinion based on having led Facebook’s efforts to fix privacy problems on its developer platform in 2011 and 2012, leading up to the company’s 2012 initial public offering.
You don’t pay for your Facebook account, at least not in cash, but of course you pay when you interact with it: that’s when you become the product that it sells to marketers. Facebook gets more than 1 billion visitors a day, Parakilas says: no wonder it’s blown up into a $500 billion behemoth in the five years since its IPO.
Given that kind of data-fueled, money-making brawn, unless regulators come knocking or angry mobs of users come bearing torches and furious headlines, Facebook just doesn’t care what happens to our data, he says:
The more data it has on offer, the more value it creates for advertisers. That means it has no incentive to police the collection or use of that data – except when negative press or regulators are involved. Facebook is free to do almost whatever it wants with your personal information, and has no reason to put safeguards in place.
Parakilas points to the golden years of addictive social games that thrived on Facebook’s developer platform: think Farmville and Candy Crush. (Remember all the maddening, non-stop invitations from friends that made you want to throw your laptop into a bubble bath with the power cord attached? Yes, those years).
Users didn’t have to pay for the addictive games – except with their data, of course.
The problem was that there were no protections around that data, Parakilas says. The data flowed through Facebook out into the eager hands of developers, and Facebook did little to stop whatever abuse happened to it after that. Not, that is, until the IPO drew nearer and the media started piping up about how the data was being misused, Parakilas said. That’s when he was tasked with solving the problem.
He found that the issue was, shall we say, not on the front burner. In fact, nobody at Facebook was checking up on those developers at all. Parakilas says that at one point, it looked like a developer was using Facebook data to generate profiles of kids, without consent. The app developer’s response: we’re not violating Facebook policies on data use! Parakilas discovered that the claim was unverifiable, because nobody was checking up on developers and how they handled data. “We had no way to confirm whether that was true,” he said.
Once data passed from the platform to a developer, Facebook had no view of the data or control over it. In other cases, developers asked for permission to get user data that their apps obviously didn’t need – such as a social game asking for all of your photos and messages. People rarely read permissions request forms carefully, so they often authorize access to sensitive information without realizing it.
Has Facebook’s attitude to privacy improved since the time that Parakilas worked there and the company went public?
Maybe, but as recently as September 2017, Facebook was hit with a privacy fine because of its missteps.
Spain penalized Facebook to the tune of €1.2m for privacy violations: the Spanish regulator, AEPD, said that the social media giant hadn’t gained adequate user consent for how it collects, stores and uses data for advertising, and found two serious infringements and one very serious infringement.
The total fine of €1.2m was made up of €600,000 for the very serious breach plus two charges of €300,000 for the two lesser infringements.
The AEPD said that Facebook had kept information for more than 17 months after users had closed their accounts, and also said that “the social network uses specifically protected data for advertising, among other purposes, without obtaining users’ express consent as data protection law demands – a serious infringement”.
So, what does all this add up to? Is Facebook as bad as it was in 2011?
Regardless of its progress or lack thereof, is Parakilas right when he says that Facebook can’t be trusted to regulate itself so that Russia doesn’t use it as a springboard to spread fake news?
What do you think?
Anonymous
Leaving the platform for good.
Marvin
Follow the money – go for those who advertise on Facebook
Without advertisers they would be as lost as they would be without users
The advertisers are just as complicit in the data abuses
Jim
Is anybody surprised by this? Heck, it would surprise me if they DID care about abuse (beyond the headliners).
Tracy
I have to agree with Jim. Money is the only driving force behind Facebook and all of the other Social Media outlets. The old saying that there is nothing free in this world is proven time and again. We are told that Facebook is and always will be free to use but look behind the smoke and mirrors. What you will find is someone behind the curtains counting the money.
Will the Government get involved in regulating Facebook? It will appear that they will but in actuality it will just be more smoke and mirrors. The only solution to the problem would be for users to quit Facebook en masse. How would that affect the economy? How much money is being made by third parties using data gathered by Facebook? I would say the genie is out of the bottle and there is no going back. As for the question of whether Facebook will regulate itself, I think Facebook will do whatever it can to make money, ethics be damned.
Thank you Miss Vaas. Nice article.
catcoode
If you are not paying for the product then you are the product. What I find most disturbing isn’t the fact that social networks have so much data but rather that many users are in the dark. So many people blindly trust that an app will do what it claims and no more. They’re shocked to learn what contact lists can be used for or that many apps can track you via geotagging.