It’s a “pretty crazy idea” to think that Facebook news stories influence people, Mark Zuckerberg said last week, even as he conceded that perhaps fake news might have had some of the effect on people that commentators have claimed.
He wasn’t speaking to Facebook advertisers, of course, though some were quick to point out that his tongue seemed to have forked:
Everyone who buys advertising should listen to Zuckerberg saying that Facebook is ineffective at influencing people. https://t.co/hoJvnVX4o5
— Anil Dash (@anildash) November 11, 2016
Facebook is now in the awkward position of having to explain why they think they drive purchase decisions but not voting decisions
— Casey Newton (@CaseyNewton) November 11, 2016
No, he wasn’t talking to ad-buying customers. Rather, Zuck was speaking to a crowd at the Techonomy conference near San Francisco, where he dismissed the hubbub about fake news swaying voters. Said voters in the US, of course, had shocked much of the world by voting for Donald Trump in the presidential election.
The Guardian quoted Zuckerberg, who said that it’s a disservice to Trump supporters to suggest they’d fall for all the hoaxes and half-truths that were oozing around cyberspace in the run-up to the election:
Voters make decisions based on their lived experience. There is a profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news.
Funny, that’s not what Facebook thinks about potato chips.
If you scroll through its gallery of targeted marketing success stories, you’ll find that Facebook boasts about advertiser Lay’s, purveyor of crisps, having used video ads to convince more people to try new potato chip flavors…
…and that the maker of an anti-blister spray for high-heel wearers convinced people to part with their foot-spray money to the tune of a whopping 30X sales increase.
Like all advertising platforms, Facebook’s full of these stories, page after page of them, all cheerfully extolling how advertisers “connected” with consumers and picked the precise kind of marketing to elicit the most response from their targets (example: high heel wearers like “raw video footage” featuring an enthusiastic foot-spray user more than they like a polished presentation).
And Facebook, again like all advertising platforms, is obviously pumped up about its power to drive decision-making. Granted, swaying voters’ decisions in a presidential election isn’t exactly the same thing as convincing them to buy Kettle Cooked Indian Tikka Masala chips.
Nonetheless, it seems there’s some cognitive dissonance at work when Facebook flogs its ability to influence consumers as an advertising platform out of one side of its face, then spins around 180 degrees, shrugs its shoulders and says, in effect: “Meh! We can’t influence people’s political viewpoints. We’re just a platform.”
“We are a tech company, not a media company,” Zuckerberg has said repeatedly over the last few years. It builds the tools and then steps back, insisting that it doesn’t bear any of the responsibilities of a publisher for verifying information.
Many commentators say that’s disingenuous: Facebook might not write or be responsible for commissioning phony or misleading news, but its algorithms parse how articles are shared and prioritise showing them to users according to criteria that don’t distinguish between fake news and real news.
We don’t know much about Facebook’s proprietary algorithm, but we do know that it takes into account how close you are to the poster, plus how many times a post has been liked or shared, as it parses which stories to push into the Trending panel.
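We can, though, sketch what engagement-driven ranking looks like in general. The toy Python below is purely illustrative – Facebook’s real algorithm is proprietary, and every weight, field name and number here is invented – but it captures the structural problem the rest of this article turns on: a score built from likes, shares, comments and social closeness has no input at all for whether a story is true.

```python
# A toy sketch of engagement-based ranking. NOT Facebook's real algorithm,
# which is proprietary; all weights and fields below are invented for
# illustration. Note what's absent: nothing here measures truthfulness.

from dataclasses import dataclass

@dataclass
class Post:
    headline: str
    likes: int
    shares: int
    comments: int
    closeness: float  # hypothetical 0-1 affinity between reader and poster

def engagement_score(post: Post) -> float:
    """Rank purely on interaction volume, boosted by social affinity."""
    interactions = post.likes + 2 * post.shares + post.comments
    return interactions * (1 + post.closeness)

posts = [
    Post("Carefully sourced report", likes=120, shares=15, comments=30, closeness=0.2),
    Post("Fabricated outrage bait", likes=900, shares=400, comments=700, closeness=0.6),
]

# The hoax outranks the real story: virality is the only thing measured.
for p in sorted(posts, key=engagement_score, reverse=True):
    print(f"{engagement_score(p):8.1f}  {p.headline}")
```

Run it and the fabricated story wins every time – which is exactly the failure mode at issue when such a score decides what trends.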
During a heavy news cycle such as the bruising US presidential campaign, fake news that picked up viral-share speed (not just on Facebook, of course) included some doozies: a hoax about an FBI agent connected to Hillary Clinton’s email disclosures having murdered his wife and shot himself (from a purported newspaper, “The Denver Guardian”, that doesn’t actually exist), and a report that President Obama and Hillary Clinton had both promised amnesty to undocumented immigrants who voted the Democratic ticket.
Facebook already knows it’s got issues with how baloney spreads virally. From a statement it sent to media outlets last week concerning its efforts to filter the dreck out of Trending:
…we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation. We’re committed to continuing to work on this issue and improve the experiences on our platform.
Unfortunately, its algorithm simply doesn’t distinguish between something that’s coming from a reputable source vs. something that’s merely pretending to be a reputable source. A good example of that is the Denver Post’s unraveling of that fake Denver Guardian article.
How phoney was that phoney newspaper? It couldn’t even pick a real address when it lied about where its newsroom is supposed to be. Instead, it listed the address of a tree in a parking lot next to a vacant bank building.
A story doesn’t have to be legit to make it into people’s newsfeeds or into the Trending panel. It just has to look legit, like the UK website The Canary, which published a story claiming that US election results had been published a week before the polls closed. It fell to blogger David Landon Cole to point out, in a post of his own, just how badly wrong that claim was.
Facebook’s commitment to the responsibilities of being a publisher was questioned when it axed the humans whose purpose it was to curate Trending topics and weed out such fake news. Within three days of swapping the editorial staff for an algorithm, Facebook had published a fake news story in Trending topics.
Facebook didn’t learn its lesson after that misstep, though. In the months since the hoax that claimed Fox News had “exposed” anchor and so-called “traitor” Megyn Kelly and kicked her out for backing Hillary Clinton, it has repeatedly trended fiction passed off as news.
However Facebook sees itself, it is nonetheless where many people go to get their news. According to Pew Research Center, a majority of US adults – 62% – get news on social media. And that number’s on a growth path: it’s up from 49% in 2012.
One problem that Facebook and thus its users are up against is that there are people and organizations dedicated to creating and spreading viral news in order to profit from the clicks.
One example: on Election Day, BuzzFeed identified Macedonian spammers as the engine behind more than 100 pro-Trump websites.
The US election may well have added fuel to the fire, but that fire has been burning for a while. Last year the Guardian reported on the hundreds of bloggers paid to flood forums and social networks at home and abroad with anti-western, pro-Vladimir Putin, pro-Kremlin comments.
Not only do we have friends who can’t tell good, credible information from reputable sources apart from dross, and who diligently spread that dross; we also have organized groups actively creating it.
It’s impossible to say how much of a part Facebook played in coloring voters’ minds and swaying the election. But more to the point, it would be a relief were Mark Zuckerberg to acknowledge the magnitude of the fake-news industry and to step up and pledge to do something about it – be it rehiring the editorial curation team or finding some other way to vet the sources whose nonsense it’s allowing to pollute online discourse.
In the meantime, we’re pretty much on our own. And the challenge is to learn to be more discerning about the information we choose to believe.
Learning how to parse information like that needs to start in schools, and Facebook needs to help by acknowledging its role as the de facto biggest publisher in the world and striving to make sure that good information from reputable sources isn’t drowned out by toxic nonsense.
Image of Mark Zuckerberg courtesy of Shutterstock.com
Mahhn
If advertising/propaganda (real or fake) didn’t sway people’s opinions, nobody would pay to do it. In which case Facebook and Google would have no income and wouldn’t even be footnotes on the internet.
G-Man
“Learning how to parse information like that needs to start in schools” – this is where ‘fake journalism’ starts: the liberal media teaches students how to write gotcha headlines and skew the information to suit their needs. What is lacking in schools is teaching people how to think.
Glen
A lot of things posted on FB are phony stories, and some are even gags to see what they can get away with (Subscribers). One must read with an OPEN mind and judge what is good or bad. FB can be useful and at the same time harmful. Just use it as the Social Network it is supposed to be.
Gordon Freeman
If critical thinking is not being encouraged in schools, or at home, then there will be no critical thinkers to decipher what is good information or bad information.
Marvin
Critical thinking is (unfortunately) deemed to be “elitist” and to be deprecated.
It is far more “democratic” to rely on crowd-thinking – after all, we are obliged to agree with the majority (whether that is the UK Brexit vote, the US electoral college, or X-Factor).
Less brain-strain; leave thinking to people like that master-strategist David Cameron or the populists Farage and Trump.
Matt Parkes
Many of my friends on Facebook are really gullible and, to my dismay, believe everything they read on there. People need to be taught from a young age not only what is real and what is fake, but also good character, morals and behaviour; maybe then people would not be so fond of creating this rubbish in the first place. And yes, parents should be teaching their children these things, but it seems they are not, as more and more parents are children themselves to some degree, whether in physical age or mental age. I don’t know which is worse.
Anon
The danger is who decides what is true or not. Facebook is full of fake articles, we all know this, but there’s also a lot of true news that doesn’t get covered by the mainstream media echo chamber. Would alternative media be labelled “fake” to prop up the old media companies selling fewer newspapers in the digital age?
Ultimately mainstream media was heavily biased against Trump and they lost; now they are looking to blame Facebook.
Marvin
Can’t we just do without Facebook?
Mahhn
If places like CNN and Fox had some journalistic integrity, maybe. But that’s just silly these days.
Marvin
But is FB etc actually just amplifying the effect of CNN & Fox? People take out-of-context snippets they “like” from the likes of them and pass them on into their group’s personal filter bubble.
Jim
Three people related to me were heavily influenced by statements “quoted” on FB. None voted for the candidate these false statements were supposedly from, and they’re all quite intelligent.
So, that’s three votes changed by false stories on FB; go back to grade school, Zuckerberg. By your own statements, you apparently need it.
Jay
People always seem to miss the point that the algorithms are driven by human interaction. So when you see something “fake” and comment on it, +1. When you share it to your timeline to tell people how dumb they are, +1. When a friend comments on that post, +1. Neither Facebook nor Google nor the others care whether what you say about something is positive or negative; they just know it’s popular.
It’s a pretty simple thing to get rid of: stop sharing it, stop interacting with it, mark it as spam, unfollow users who share spam (WITHOUT COMMENTING!), and be a citizen of the internet who doesn’t engage with grossly inaccurate drivel.
My Facebook timeline and my Twitter feed have absolutely zero misinformation posts, and it wasn’t even hard to achieve. I just took control of the algorithm, which is exactly how it was meant to be used in the first place.
Graham Jones
Fake news has always existed. Ask any journalist: they get calls every day claiming something has happened when it hasn’t. In the world of traditional media fake news has always been there, but the editorial process weeds it out. One reason for the existence of editors is to ensure only “real news” gets published and that non-stories and fakery are filtered out.
Just because lots of people like something, share it, recommend it or believe it, does not make it true. Editors sift through all the popularity and the believing in something, to find out if the item really has substance. Sometimes what they do is produce a story that people do not believe (and will not share) even though it is true.
Facebook’s algorithm appears to start with the premise that sharing, popularity and liking something makes it worthwhile. It doesn’t. The entire Facebook news algorithm seems to be based on a false premise.
One day they will realise this and will discover that if they want to “do news” they will need to employ editors: people who can sniff out a fake story within moments, faster than any algorithm. Perhaps what the youthful Mr Zuckerberg really needs to work out is that technology is not always the answer; sometimes it is human experience.
Emily
The thing is, they did have those human editors, but Facebook decided there was too great a risk of bias in the editing, so it sacked them and moved to the algorithmic method it uses now. It seems like a bit of a “damned if they do, damned if they don’t” situation: neither method has worked for them, and each approach has its own faults.
Marvin
if they want to “do news”
They don’t care about “doing news”; they want to create connections, so that they know more about their stock (information about us) and it has more value when they sell it on to advertisers.
Cat videos, news, Rick-rolling and epic fails are all the same in this world; the propensity to make users create connections is what determines their “value”.