Video sharing giant YouTube has decided to disable the comment option on all videos featuring young children. The announcement comes after it was discovered that a large online community of pedophiles was time-stamping videos of children, referring other sick individuals to the exact video frames that captured children in compromising positions.
Following the sordid discovery, a number of large companies, such as Fortnite-maker Epic Games, Nestlé and Disney, pulled their advertisements from the platform, prompting YouTube to take immediate action to remedy the situation.
“Recently, there have been some deeply concerning incidents regarding child safety on YouTube,” wrote company CEO Susan Wojcicki on Twitter. “Nothing is more important to us than ensuring the safety of young people on the platform. More on the steps we’re taking to better protect children & families.”
In a blog post published Feb. 27, the company stated that it had “disabled comments from tens of millions of videos that could be subject to predatory behavior.”
In addition, the multi-billion-dollar platform stated that it would be “broadening this action to suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behavior.”
YouTube noted that a “small number of creators” would be allowed to keep comments enabled on these types of videos, but would be required to “actively moderate” the comments posted to their videos.
The platform also announced a widespread crackdown on channels that pose a risk to minors.
“No form of content that endangers minors is acceptable on YouTube, which is why we have terminated certain channels that attempt to endanger children in any way,” the company continued. “Videos encouraging harmful and dangerous challenges targeting any audience are also clearly against our policies. We will continue to take action when creators violate our policies in ways that blatantly harm the broader user and creator community.”
How was the pedophile ring discovered?
In a 20-minute video viewed more than 2 million times since being uploaded Sunday, blogger Matt Watson explained how the video-hosting website’s recommendation system operates as a “wormhole” for pedophilic content.
Watson demonstrated, through screenshots and recordings, that if users click on one of the videos with underage girls, most of whom are seen doing gymnastics or posing in compromising positions, YouTube’s algorithm will then flood their computer screens with recommendations for similar clips.
“I can consistently get access to [the videos in question] from vanilla, never-before-used YouTube accounts via innocuous videos in less than 10 minutes, in sometimes less than five clicks,” he explained.
At the time of writing, Watson’s investigative video has received over 3 million views.