Zuckerberg pushes back on fears over fake news on Facebook

It’s a “pretty crazy idea” to think that Facebook news stories influence people, Mark Zuckerberg said last week, even as he conceded that perhaps fake news had unintentionally influenced people in the way some commentators have claimed.

He wasn’t speaking to Facebook advertisers, of course, though some were quick to point out that his tongue seemed to have forked.

No, he wasn’t talking to ad-buying customers. Rather, Zuck was speaking to a crowd at the Techonomy conference near San Francisco, where he dismissed the hubbub about fake news swaying voters. Said voters in the US, of course, had shocked much of the world by voting for Donald Trump in the presidential election.

The Guardian quoted Zuckerberg, who said that it’s a disservice to Trump supporters to suggest they’d fall for all the hoaxes and half-truths that were oozing around cyberspace in the run-up to the election:

Voters make decisions based on their lived experience. There is a profound lack of empathy in asserting that the only reason someone could have voted the way they did is because they saw fake news.

Funny, that’s not what Facebook thinks about potato chips.

If you scroll through its gallery of targeted marketing success stories, you’ll find Facebook boasting that advertiser Lay’s, purveyor of crisps, used video ads to convince more people to try new potato chip flavors.

…and that the maker of an anti-blister spray for high-heel wearers convinced people to part with their foot-spray money to the tune of a whopping 30X sales increase.

Like all advertising platforms, Facebook’s full of these stories, page after page of them, all cheerfully extolling how advertisers “connected” with consumers and picked the precise kind of marketing to elicit the most response from their targets (example: high heel wearers like “raw video footage” featuring an enthusiastic foot-spray user more than they like a polished presentation).

And Facebook, again like all advertising platforms, is obviously pumped up about its power to drive decision-making. Granted, swaying voters’ decisions in a presidential election isn’t exactly the same thing as convincing them to buy Kettle Cooked Indian Tikka Masala chips.

Nonetheless, it seems there’s some cognitive dissonance at work when Facebook flogs its ability to influence consumers as an advertising platform out of one side of its face, then spins around 180 degrees, shrugs its shoulders and says, in effect: “Meh! We can’t influence people’s political viewpoints. We’re just a platform.”

“We are a tech company, not a media company,” Zuckerberg has said repeatedly over the last few years. The company builds the tools and then steps back, he insists, without bearing any of a publisher’s responsibility for verifying information.

Many commentators say that’s disingenuous: Facebook might not write or be responsible for commissioning phony or misleading news, but its algorithms parse how articles are shared and prioritise showing them to users according to criteria that don’t distinguish between fake news and real news.

We don’t know much about Facebook’s proprietary algorithm, but we do know that it takes into account how close you are to the poster, plus how many times a post has been liked or shared, as it parses which stories to push into the Trending panel.
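
To see why that matters, here’s a deliberately simplistic sketch, in Python, of what engagement-only ranking looks like. To be clear: Facebook’s real algorithm is proprietary, and the features, weights and names below are invented purely for illustration.

```python
# Hypothetical sketch of an engagement-only trending score.
# Facebook's real algorithm is proprietary; the features, weights
# and names here are invented purely for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    likes: int
    shares: int
    closeness_to_viewer: float  # 0.0 (stranger) to 1.0 (close friend)

def trending_score(post: Post) -> float:
    """Rank a post on engagement signals alone.

    Note what's missing: nothing here asks whether the story is
    true, or whether its source is a real news outlet.
    """
    engagement = post.likes + 2 * post.shares  # assume shares count double
    return engagement * (1.0 + post.closeness_to_viewer)

# A viral hoax shared by a close friend easily outranks a sober,
# accurate report seen via a stranger:
hoax = Post(likes=5_000, shares=3_000, closeness_to_viewer=0.9)
report = Post(likes=800, shares=200, closeness_to_viewer=0.1)
print(trending_score(hoax) > trending_score(report))  # True
```

The point of the toy example: because the score never asks whether a story is true, a well-shared hoax from a close friend can outrank an accurate report from a stranger.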

During a heavy news cycle such as the bruising US presidential campaign, fake news that picked up viral-share speed (not just on Facebook, of course) included some doozies. One was a hoax about an FBI agent connected to Hillary Clinton’s email disclosures having murdered his wife and shot himself, from a purported newspaper called “The Denver Guardian” that doesn’t actually exist; another was a report that President Obama and Hillary Clinton had both promised amnesty to undocumented immigrants who voted Democratic.

Facebook already knows it’s got issues with how baloney spreads virally. From a statement it sent to media outlets last week concerning its efforts to filter the dreck out of Trending:

…we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation. We’re committed to continuing to work on this issue and improve the experiences on our platform.

Unfortunately, its algorithm simply doesn’t distinguish between something that’s coming from a reputable source vs. something that’s merely pretending to be a reputable source. A good example of that is the Denver Post’s unraveling of that fake Denver Guardian article.
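
For contrast, here’s an equally simplistic sketch of the kind of source check that engagement-only ranking omits. The allowlist and the helper function are illustrative assumptions, not a description of anything Facebook actually does, and a real system would need far more than a static domain list.

```python
# A deliberately naive sketch of the source check that the scoring
# above omits. The allowlist and helper are illustrative assumptions,
# not anything Facebook actually does. Requires Python 3.9+ for
# str.removeprefix().
from urllib.parse import urlparse

KNOWN_OUTLETS = {"denverpost.com", "theguardian.com", "nytimes.com"}

def source_looks_reputable(url: str) -> bool:
    """Return True only if the story's domain is a known outlet."""
    domain = urlparse(url).netloc.lower().removeprefix("www.")
    return domain in KNOWN_OUTLETS

# The lookalike fails even this trivial check; the real paper passes.
print(source_looks_reputable("http://denverguardian.com/fbi-agent-story"))        # False
print(source_looks_reputable("http://www.denverpost.com/2016/11/05/some-story"))  # True
```

Even this trivial check flags the lookalike, which is exactly the distinction the trending algorithm wasn’t making.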

How phoney was that phoney newspaper? It couldn’t even pick a real address when it lied about where its newsroom was supposed to be. Instead, it listed the address of a tree in a parking lot next to a vacant bank building.

A story doesn’t have to be legit to make it into people’s newsfeeds or into the Trending panel. It just has to look legit, like the UK website The Canary, which published a story claiming that US election results had been published a week before the polls closed. It fell to blogger David Landon Cole to point out, in a blog post of his own, how badly wrong that claim was.

Facebook’s commitment to the responsibilities of being a publisher was questioned when it axed the human editors whose job it was to curate Trending topics and weed out such fake news. Within three days of swapping the editorial staff for an algorithm, Facebook had published a fake news story in Trending topics.

Facebook didn’t learn its lesson after that misstep, though. In the months since that hoax, which claimed Fox News had “exposed” news anchor Megyn Kelly as a “traitor” and kicked her out for backing Hillary Clinton, it has repeatedly trended fiction passed off as news.

However Facebook sees itself, it is nonetheless where many people go to get their news. According to Pew Research Center, 62% of US adults get news on social media. And that number’s on a growth path: it’s up from 49% in 2012.

One problem that Facebook and thus its users are up against is that there are people and organizations dedicated to creating and spreading viral news in order to profit from the clicks.

One example: on Election Day, BuzzFeed identified Macedonian spammers as the engine behind more than 100 pro-Trump websites.

The US election may well have added fuel to the fire, but that fire has been burning for a while. Last year the Guardian reported on the hundreds of bloggers paid to flood forums and social networks at home and abroad with anti-western, pro-Vladimir Putin, pro-Kremlin comments.

Not only do we have friends who can’t tell good, credible information from reputable sources apart from dross, and who diligently spread that dross; we also have organized groups actively creating it.

It’s impossible to say how much of a part Facebook played in coloring voters’ minds and swaying the election. But more to the point, it would be a relief were Mark Zuckerberg to acknowledge the magnitude of the fake-news industry and to pledge to do something about it – be it rehiring the editorial curation team or finding some other way to vet the sources whose nonsense it’s allowing to pollute online discourse.

In the meantime, we’re pretty much on our own. And the challenge is to learn to be more discerning about the information we choose to believe.

Learning how to parse information like that needs to start in schools, and Facebook needs to do its part by acknowledging its role as the de facto biggest publisher in the world, and by striving to make sure that good information from reputable sources isn’t drowned out by toxic nonsense.


Image of Mark Zuckerberg courtesy of Shutterstock.com