
YouTube conspiracy videos to get links to Wikipedia and other sources

Were the US moon landings faked? Did director Stanley Kubrick rig the astronauts up with theatrical wires in a movie studio and bounce them up and down to simulate low gravity?
We’re not going there. We’re not going to the moon, and we’re not going to try to talk anybody out of their belief that visual flashes in videos betray the wires. But YouTube is – at least, it’s getting ready to put a bit more context around such content.
Reuters reported on Tuesday that YouTube – a unit of Alphabet’s Google – is planning to slap excerpts from Wikipedia and other websites onto pages containing videos about hoaxes and conspiracy theories, such as the ones relating to moon landings.
YouTube CEO Susan Wojcicki delivered the news at the South by Southwest Conference (SXSW) in Austin, Texas, on Tuesday. She displayed a mock-up of the new feature, which will be called “information cues.”
Wojcicki said that the videos slated to get this treatment won’t go away. They’ll just be accompanied by additional sources:

People can still watch the videos but then they actually have access to additional information, can click off and go and see that.

The information cues won’t appear on all controversial videos. Engadget reports that at least at first, the cues – including a text box linking to a third-party source such as Wikipedia – will only appear around videos regarding conspiracies that have “significant debate.”
Here’s a statement sent out by a YouTube spokesperson:

We’re always exploring new ways to battle misinformation on YouTube. At SXSW, we announced plans to show additional information cues, including a text box linking to third-party sources around widely accepted events, like the moon landing. These features will be rolling out in the coming months, but beyond that we don’t have any additional information to share at this time.

This is just one of many approaches that major content platforms such as Google and Facebook have floated in response to lawmakers and media advocacy groups asking for their help in battling hoaxes and fake news.


Google did something similar in April, adding Fact Check tags – gleaned from a community of 115 fact-checking organizations – to some of its search and news results.
Both Facebook and Google have tried pushing potentially fake content down in their news rankings. Facebook has also tried sticking “Disputed” flags onto what some of us call fake news and what others call the stories that mainstream news outlets with hidden agendas want to suffocate. It subsequently mothballed the flags after admitting they hadn’t done squat to stop the spread of fake news.
As Engadget points out, YouTube doesn’t just host and display videos that push extreme conspiracies. Its recommendation algorithm also suggests related videos, which can push the craziest content to the top of its rankings, furthering its spread and giving creators an incentive to churn out similar content.
A case in point: a video accusing survivors of the school shooting in Parkland, Florida of having been coached to play the part of “crisis actors” topped YouTube’s trending list last month.
Will the information cues lessen such algorithm-boosted dissemination? It would be nice to think so, but a similar approach didn’t work very well for Facebook. Recent research has shown that people relish fake news. It’s so much more colorful than the plain old humdrum truth.
Good luck turning that around, YouTube.

