YouTube will turn off comments on all videos that contain young children, the company says, one of a number of enforcement actions designed to stave off an advertiser boycott sparked by the discovery of an organised paedophile ring operating in plain sight on the video-sharing platform.
The company will disable all comments on videos featuring younger children, and will also disable comments on videos of older children that risk attracting predatory behaviour, YouTube says.
It has also prioritised the launch of an AI moderator that is “more sweeping in scope, and will detect and remove two times more individual comments” than its predecessor, in an attempt to identify and remove predatory comments before they can cause harm.
The new policies follow the discovery last week of a paedophile ring that used the platform to find and share clips of videos featuring young children in states of undress. The group, discovered by YouTuber Matt Watson, would post comments in videos of young girls doing exercises, dancing, or performing gymnastics, often adding details like the time stamps at which underwear was visible.
YouTube’s recommendation algorithm was even co-opted by the group: after a viewer watched enough videos favoured by the ring’s members, the algorithm would automatically start to recommend other videos featuring young children.
“Paedophiles are trading social media contacts; they’re trading links to actual child porn in YouTube comments; they’re trading unlisted videos in secret, and YouTube’s algorithm through some glitch in its programming is facilitating their ability to do this,” Watson said at the time.
The discovery prompted an advertiser boycott of the platform, as companies including Fortnite-maker Epic Games and Nestlé pulled their ads from the site.
YouTube hopes to prevent further harm by disabling comments on almost all videos of young children, leaving only a small number of approved accounts able to comment.
“We recognise that comments are a core part of the YouTube experience and how you connect with and grow your audience,” said YouTube in a blogpost addressing affected creators. “At the same time, the important steps we’re sharing today are critical for keeping young people safe. Thank you for your understanding and feedback as we continue our work to protect the YouTube community.”
YouTube has also taken action on a second, unrelated, child protection concern, banning a number of accounts that “attempted to endanger children” by splicing shocking content intended for adults in with cartoons and other content intended for children.
One such edit featured a YouTuber named Filthy Frank, who had produced a clip, originally satirical, “advising” children on how to kill themselves. That scene, spliced into a children’s cartoon, ended up not only being seen by children but also making it past the automated filters on to YouTube’s own YouTube Kids service.