Why it matters: After an exposé video by Matt Watson went viral, YouTube has taken several measures to curb pedophiles from exploiting the platform — a problem that has pervaded the service since at least 2017.
YouTube announced on Thursday that it would be disabling comments on videos featuring children. The move comes after a “soft-core pedophilia ring” was exposed last week.
Former YouTuber Matt Watson angrily exposed an organized group of child predators in a video he said would be his last on the platform. The video quickly went viral, earning more than 3 million views in less than a week. In reaction, companies including Epic Games, Peloton, and Disney began pulling ads. YouTube responded to the backlash by deleting the offending comments as well as the associated accounts.
However, it soon became clear that the problem was more significant than anyone suspected. In less than two days, YouTube had deleted over 400 channels owned by alleged predators, along with "tens of millions" of their comments. It then began disabling comments on videos from regular users that had been targeted by the predatory behavior.
"These efforts are focused on videos featuring young minors and we will continue to identify videos at risk over the next few months."
Today, YouTube outlined its plan for future action. In an address to content creators, it said it would continue suspending comments on "at risk" videos featuring "young minors," and would expand the effort to include videos featuring older children as well.
“Over the past week, we disabled comments from tens of millions of videos that could be subject to predatory behavior. These efforts are focused on videos featuring young minors and we will continue to identify videos at risk over the next few months. Over the next few months, we will be broadening this action to suspend comments on videos featuring young minors and videos featuring older minors that could be at risk of attracting predatory behavior.”
YouTube acknowledged that this might affect the way some channels connect with their audience, but said its priority is keeping young people safe. The platform will allow a small number of creators with this type of content to keep their comments enabled, but they will be required to "actively moderate" the channel and show that their videos are at low risk of attracting such comments.
"This classifier will detect and remove 2X more individual comments [and] does not affect the monetization of your video."
YouTube also said that it had accelerated the rollout of an AI algorithm that will be used to classify and delete offending comments. It claims the new classifier will be able to detect and remove twice as many comments as the algorithm currently used.
It is good to see YouTube finally taking action on this matter. The Times of London exposed the problem as far back as 2017. One would assume the current ad boycott had something to do with YouTube's decisive action, but several companies — including Adidas, Deutsche Bank, eBay, and Amazon — pulled advertising in 2017 with little effect. So attributing the turnaround to lost ad revenue is not so cut and dried.
We reached out to parent company Google for comment, but have not heard back. We will update if it provides a statement.