On Thursday, YouTube announced on its creator blog that it’s disabling comments on millions of videos featuring minors, in response to reports that creeps are leaving disgustingly sexual comments on videos featuring kids doing things like yoga or gymnastics, or playing games such as Twister.
As content creator Matt Watson had documented a week earlier, such comment sections contained what he called a “wormhole”: within as few as five clicks, you could find yourself in a “soft-core pedophilia ring” where child oglers leave sexual comments and connect with each other in the comment sections of innocuous videos featuring children, sharing contact information or, sometimes, links to actual child abuse imagery.
The news sent advertisers fleeing, as big brands such as Disney, Fortnite maker Epic Games, GNC and Nestlé pulled their ads.
YouTube said on Thursday that over the prior week, the platform had disabled comments on tens of millions of videos that could be “subject to predatory behavior.” Over the coming months, it also plans to suspend comments on videos featuring “young minors” and those featuring older minors that “could be at risk of attracting predatory behavior.”
It’s not shutting down comments on all such videos: YouTube said that a small number of creators will be able to keep comments enabled, though they’ll be required to actively moderate their comments and to demonstrate a low risk of predatory behavior. YouTube said it will work with such creators directly and hopes to grow their numbers as it improves its ability to catch violative comments.
Predator filter will remove 2X more comments
YouTube said that it’s been removing hundreds of millions of violative comments, but it’s also been working on an even more effective classifier built specifically to sweep up predatory comments. It has sped up the classifier’s launch; the new system doesn’t affect video monetization and is supposed to detect and remove 2X more individual comments.
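To make the idea of a comment classifier concrete, here’s a minimal sketch of how such a moderation pipeline is typically structured, using a toy bag-of-words model in Python with scikit-learn. YouTube hasn’t published how its classifier actually works, so the model choice, the placeholder training comments and the scoring cutoff below are all illustrative assumptions, not a description of its system.

```python
# A minimal, hypothetical sketch of a comment-moderation classifier.
# NOT YouTube's system: the pipeline, training data and cutoff below
# are placeholders for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data: 1 = violates comment policy, 0 = benign.
# A real system would train on a vastly larger, human-reviewed corpus.
comments = [
    "great tutorial, thanks for sharing",
    "loved the editing on this one",
    "subscribed, keep the videos coming",
    "PLACEHOLDER POLICY-VIOLATING COMMENT 1",
    "PLACEHOLDER POLICY-VIOLATING COMMENT 2",
    "PLACEHOLDER POLICY-VIOLATING COMMENT 3",
]
labels = [0, 0, 0, 1, 1, 1]

# Bag-of-words features feeding a linear classifier: a deliberately
# simple stand-in for whatever model YouTube actually runs.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

# Score incoming comments; anything over the cutoff gets queued for removal.
# The cutoff trades missed violations against false flags and would be
# tuned carefully in practice.
CUTOFF = 0.5
for comment in ["thanks, great tutorial", "PLACEHOLDER POLICY-VIOLATING COMMENT 1"]:
    p_violation = model.predict_proba([comment])[0][1]
    if p_violation >= CUTOFF:
        print(f"flag for removal ({p_violation:.2f}): {comment!r}")
```

The hard part is presumably not the model itself but running it at platform scale with an acceptable false-positive rate, which would explain why YouTube pairs the classifier with human moderation and is expanding its use gradually.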
YouTube said that it’s also removed thousands of inappropriate comments on videos showing minors and has terminated hundreds of viewer accounts over their comments. In addition, it’s reported illegal comments to the National Center for Missing & Exploited Children (NCMEC), which works with the proper authorities.
Content creators cry foul
The exodus of big businesses has led some creators to label the incident “Adpocalypse 2.0.”
Adpocalypse 1.0 happened between March and May 2017, when major advertisers such as AT&T and Johnson & Johnson yanked ads that were appearing on YouTube videos espousing extremism and hate speech. In response, YouTube abruptly rolled out changes to its automated processes for placing ads across the platform – a move widely considered punitive toward creators whose material wasn’t racist or sympathetic to terrorists.
Nearly two years later, content creators again feel like they’re being punished despite having done nothing wrong. In response to Google’s announcement, Twitter user Tay Zonday told the company that it has the technology to do more than just blame content creators for the problem.
Others have criticized Watson over his repeated calls for big advertisers to pull out of YouTube and for consumers to boycott them – calls he pushed in five videos uploaded after the initial one.
One critic was Daniel Keem, also known as Keemstar, host of the YouTube channel DramaAlert, who said that a second Adpocalypse wouldn’t stop child abusers:
If advertisers leave YouTube, this isn’t going to stop the pedos in the comments section. This is just going to hurt the livelihood of YouTubers big and small.