
YouTube to crack down on inappropriate videos targeting kids

What exactly are we to make of videos that are targeted at kids but that feature a) cartoon characters that turn into monsters and try to feed each other to alligators, b) a Claymation Spiderman urinating on Elsa of “Frozen”, or c) Nick Jr. characters in a strip club?

The lesson is that the human-free, automatic filters that are supposed to strip out any YouTube content that isn’t child-friendly before it can be streamed on the supposedly kid-safe YouTube Kids site are far from foolproof.

Earlier this month, the New York Times reported that a startling collection of disturbing videos is slipping past the algorithms erected to keep bad actors out of YouTube Kids.

The NYT included links to several videos, including those listed above, most of which have since been removed. One exception that’s still up is a live-action video showing parents rough-housing with their daughter, featuring a scene in which the parents shave the young girl’s forehead, making her wail and apparently bleed.

This isn’t how it was supposed to be. In February 2015, YouTube announced it was launching a kid-friendly zone: one in which youngsters would be spared the hair-raising comments that have turned YouTube into a fright fest.

According to the NYT, parents and kids have flocked to YouTube Kids since then. It’s pulling in more than 11 million weekly viewers, attracted by a seemingly bottomless barrel of clips, including those from kid-pleasing shows by Disney and Nickelodeon. Those viewers include parents who assume that on YouTube Kids, their kids will only see age-appropriate content that’s been scrubbed of the muck that you can find on the main YouTube site.

That muck includes content that can be racist, anti-Semitic, homophobic, sexual, or horrifically violent. Some YouTube content denies the Holocaust or seeks to justify terrorism or crimes against humanity. Such content has, in fact, put YouTube in the cross-hairs of groups seeking legal sanctions against not only YouTube but also its social media brethren Facebook and Twitter in countries such as France and Germany, where the content is illegal.

Who is evading YouTube Kids’ filters to target children with these bizarre videos? And why are they doing it? To what end does someone want to feature Mickey Mouse lying in the street in a pool of blood as Minnie Mouse looks on?

Apparently, the NYT reports, there’s money being made. These bizarre videos come laced with automatically placed ads. That means that both the creators and YouTube are profiting.

But YouTube has been fighting back: in the months leading up to August, it announced that content creators could no longer make a tidy profit from inappropriate use of family-friendly characters (bye-bye, Spiderman urinating on Elsa).

As the NYT reports, the videos are “independently animated, presumably to avoid copyright violations and detection.” Some recently uploaded clips have millions of views on the main YouTube site. It’s not clear how many of those views came from YouTube Kids.

YouTube is both addressing the problem and trying to minimize it. It claims that the videos on YouTube Kids that were missed by its algorithmic filters and then flagged by users during the last 30 days amounted to a minuscule 0.005% of the total. According to The Verge, YouTube has dismissed reports that inappropriate videos racked up millions of views on YouTube Kids without being vetted. The company says those views came from activity on YouTube proper, which makes clear in its terms of service that it’s aimed at users who are at least 13 years old.

YouTube last week announced that it’s implementing a new policy that age-restricts this type of content in the main YouTube app when flagged. Juniper Downs, YouTube’s director of policy, told The Verge that “Age-restricted content is automatically not allowed in YouTube Kids.” The policy has been in the works for a while, so the rollout isn’t a direct response to recent coverage of the inappropriate content, YouTube says.

Algorithmic filters are the first step in keeping this stuff off YouTube Kids. Then, YouTube says, a team of humans reviews flagged videos. If a video flagged in the main app contains children’s characters, it’s sent to the policy review team. YouTube says it has thousands of people, including volunteers, reviewing flagged content around the clock, around the world. If the review concludes that a video violates the new policy, it’s age-restricted, automatically blocking it from showing up in the Kids app.
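
To make that sequence concrete, here’s a minimal Python sketch of the flow as YouTube describes it: automated filter, user flag, human review, age restriction, and automatic exclusion from the Kids app. It’s purely illustrative – the names (Video, review_flagged_video, eligible_for_kids_app) are hypothetical stand-ins, not anything from YouTube’s actual systems:

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    passed_algorithmic_filter: bool = True   # step 1: automated screening
    user_flagged: bool = False               # step 2: viewers report it
    has_childrens_characters: bool = False
    age_restricted: bool = False

def violates_new_policy(video: Video) -> bool:
    """Stand-in for a human reviewer's judgment call."""
    return True  # assume the reviewer upholds the flag in this sketch

def review_flagged_video(video: Video) -> None:
    """Step 3: a flagged video featuring children's characters goes to
    the policy review team; a violation earns an age restriction."""
    if video.user_flagged and video.has_childrens_characters:
        if violates_new_policy(video):
            video.age_restricted = True  # step 4: age-restrict it

def eligible_for_kids_app(video: Video) -> bool:
    """Age-restricted content is automatically kept out of YouTube Kids."""
    return video.passed_algorithmic_filter and not video.age_restricted

clip = Video("cartoon mashup", user_flagged=True, has_childrens_characters=True)
review_flagged_video(clip)
print(eligible_for_kids_app(clip))  # False: the restriction keeps it off Kids
```

The key design point in the described flow is that the age restriction, once applied in the main app, is what blocks a video from the Kids app – no separate Kids-side takedown is needed.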

YouTube is hoping that in the window between content appearing on YouTube and making its way onto YouTube Kids – a matter of days – users will flag clips that could disturb children. The Verge reports that YouTube will soon start training its review team on the new policy, which should be live within a few weeks.

