
YouTube Kids hasn’t cleaned up its act

YouTube is apologizing, again, uttering the tried-and-true “we have to do better.”

How to sharpen knives. Characters from the Paw Patrol cartoon screaming on a burning plane. Images of blood-stained clowns.
Déjà vu!
Yes, the fright fest is back again at YouTube Kids, and YouTube is apologizing, again, uttering the tried-and-true “we have to do better.” An investigation by BBC Newsround is only the latest to find inappropriate content on the Google-owned site.
This is not what YouTube Kids was supposed to be. Launched in February 2015, it was meant to be sanitized: a place where youngsters would be spared the hair-raising comments and content to be found on the rest of YouTube.
But in November, the New York Times reported that a startling collection of disturbing videos was slipping past the algorithms erected to keep bad actors off YouTube Kids.
As of November, YouTube Kids was pulling in more than 11 million weekly viewers, attracted by a seemingly bottomless barrel of clips, including those from kid-pleasing shows by Disney and Nickelodeon. Those viewers include parents who assume that their kids will only see age-appropriate content that’s been scrubbed of the muck you can find on the main YouTube site, be it racist, anti-Semitic, homophobic, sexual, or horrifically violent.
YouTube said in November that it was hoping users would flag potentially disturbing clips during the window – about five days – between content appearing on YouTube and making its way onto YouTube Kids. It also planned to start training its review team on a new policy that age-restricts this type of content in the main YouTube app when it’s flagged.


In the most recent investigation, Newsround found disturbing content on YouTube Kids, such as Mickey Mouse characters with guns and children’s characters being injured. In fact, hundreds of disturbing videos have been found on YouTube Kids in recent months.
The videos show unpleasant things happening to various characters, including those from the Disney movie Frozen, the Minions franchise, Doc McStuffins and Thomas the Tank Engine.
Newsround arranged for five children to meet Katie O’Donovan, a Google public policy manager responsible for economic and skills policy in the UK. She told the BBC that she was “very, very sorry for any hurt or discomfort” caused by the videos the children saw on YouTube and YouTube Kids.
YouTube told Newsround that it’s implemented a variety of processes to try to block inappropriate material:

We have seen significant investment in building the right tools so people can flag that [content], and those flags are reviewed very, very quickly. We’re also beginning to use machine learning to identify the most harmful content, which is then automatically reviewed.

Once content has been flagged and reviewed, it shouldn’t appear on YouTube Kids. The only people who should be able to view it are those who are signed in and over the age of 18.
Is there in fact just too much content to check?
Well, it’s certainly a challenge, Google says, given that YouTube remains an open platform: content is uploaded and, straight away, it’s live. Newsround quoted a Google spokesperson:

It is a difficult balance to get right. It is a difficult environment because things are moving so, so quickly.

Newsround noted that according to YouTube’s own statistics, almost a third of internet users use the platform. They watch a billion hours of video every day.
As regards its responsibility for checking videos before they go on YouTube Kids, YouTube had this to say:

We have a responsibility to make sure the platform can survive and can thrive so that we have a collection that comes from around the world on there.

That doesn’t quite answer the question, though, does it? Of course, the question could be applied beyond YouTube. As in, what responsibility do we ourselves have to make sure that internet-supplied content is safe for kids, and how much responsibility do we cede to a platform like YouTube, which is awash in the cute ‘n cartoony, the best kitten videos EVUH, and, oh no, Spiderman’s breaking Elsa’s arm again?
Naked Security’s Mark Stockley has written an article, 3 things you can do for your social networks, to make them safer, and just plain nicer. He even blew the dust off a word you don’t hear much nowadays: “netiquette.” Remember that one? Wouldn’t it be nice if we revived it?
Our very own Paul Ducklin has also written an article about the internet, the software that runs on it (bugs and all), and what could be fixed or improved with the goal of keeping young people safe. That one’s titled 3 things your social networks can do for you.
So in honor of being nice and netiquette-respectful of YouTube, let’s just hope that it uses whatever it takes to keep our kids from playing with virtual matches, be it algorithms or armies of humans vetting material around the clock.

2 Comments

We ended up removing one thing to fix this: the wifi password. If the app doesn’t work offline, my kids don’t get it anymore.


The key here is “Google”. What do we expect from a massive, greedy billion-dollar corporation? They couldn’t care less about kids, or any normal people for that matter. Whatever boosts the Google bottom line. Pathetic company.

