
Facebook’s new fake news strategy is… decide for yourself!

Facebook is getting rid of fact-checkers and leaving it up to readers to decide what's real

Who are these yo-yos who share fake news on social media?
None of your friends, right? Your friends are too smart to fall for cockamamie click bait, and they’re diligent enough to check a source before they share, right?
Well, get ready to have the curtain drawn back. These yo-yos may be us. Or, at least, they may turn out to be our friends and/or relatives.
In its ongoing fight against fakery, Facebook has started putting some context around the sources of news stories. That covers all news stories, from sources with good reputations to the junk factories and junk-churning bot armies making money from the fakery.
On Wednesday, Facebook announced that it’s adding features to the context it started putting around News Feed publishers and articles last year.
You might recall that in March 2017, Facebook started slapping “disputed” flags on what its panel of fact-checkers deemed fishy news.
You might also recall that the flags just made things worse. The flags did nothing to stop the spread of fake news, instead only causing traffic to some disputed stories to skyrocket as a backlash to what some groups saw as an attempt to bury “the truth”.


In Facebook’s new spin on “putting context” around news and its sources, it’s not relying on fact-checkers. Rather, it’s leaving it up to readers to decide for themselves what to read, what to trust and what to share. When it mothballed the “disputed” flags, Facebook noted that fact-checkers can be sparse in some countries anyway.
So this time around, Facebook said, the context is going to include the publisher’s Wikipedia entry, related articles on the same topic, information about how many times the article has been shared on Facebook, where it’s been shared, and an option to follow the publisher’s page. If a publisher doesn’t have a Wikipedia entry, Facebook will indicate that the information is unavailable, “which can also be helpful context,” it said.
Facebook is rolling out the feature to all users in the US. If the feature has been turned on for you, you’ll see a little “i” next to the title of a news story. It looks like this:

Once you click on that i, you’ll get a popup that shows the Wikipedia entry for the publisher (if available), other articles from the publisher, an option to follow the publisher, a map of where in the world the story has been shared, and, at the bottom, a list of who among your friends has shared it, along with the total number of shares.
Like so:

Facebook says it’s also starting a test to see if providing information about an article’s author will help people to evaluate the credibility of the article. It says that people in this test will be able to tap an author’s name in Instant Articles to see additional information, including a description from the author’s Wikipedia entry, a button to follow their Page or Profile, and other recent articles they’ve published.
The company says that the author information will only display if the publisher has implemented author tags on its website to associate the author’s Page or Profile with the article byline, and the author has validated their association with the publisher. The test will start in the US.
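
For publishers wondering what that involves: Facebook’s announcement doesn’t spell out the markup here, but author tags of this kind generally live in an article’s HTML as metadata pointing at the author’s Facebook Page or Profile. The sketch below is our own illustration, not Facebook’s documentation; it assumes an Open Graph-style article:author meta element, and the exact tag name Facebook requires may differ.

from html.parser import HTMLParser

class AuthorTagFinder(HTMLParser):
    # Collects the content of any <meta property="article:author"> tags it sees.
    def __init__(self):
        super().__init__()
        self.author_urls = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("property") == "article:author" and attrs.get("content"):
            self.author_urls.append(attrs["content"])

# Hypothetical article markup; the Page URL below is a placeholder.
sample_article_html = """
<html><head>
  <meta property="article:author" content="https://www.facebook.com/ExampleAuthorPage" />
</head><body>Article body goes here.</body></html>
"""

finder = AuthorTagFinder()
finder.feed(sample_article_html)
print(finder.author_urls)   # ['https://www.facebook.com/ExampleAuthorPage']

In practice, a publisher would run a check along these lines against its own article pages before expecting the byline to link anywhere.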


7 Comments

Wikipedia is usually a decent way to gauge a source’s reliability, but not always. When Overstock.com’s founder, Patrick Byrne, was making his widely publicized rants on short-selling, the Wikipedia article whipsawed back and forth about his credibility until it was finally locked by the “winning” side.

What could possibly go wrong? “Challenge accepted” is echoing throughout the bot armies.

I don’t see this having any material impact. The simple fact is that many people do not apply logical reasoning, especially on emotionally charged issues. People believe fake news because they want to believe fake news. They see a meme or a story that confirms their preexisting bias and they run with it, not only accepting it but spreading it to others. They want it to be true, to prove themselves right.

They are not getting rid of fact-checkers: “As of yesterday, we’re fact-checking photos and videos, in addition to links. We’re starting in France with the AFP and will be scaling to more countries and partners soon.” [FB newsroom link redacted]
