The YouTube Kids app, a supposedly family-friendly version of Google’s video-sharing site, has been slammed for allowing disturbing content to get past its filters. Both the New York Times and writer James Bridle report that the bizarre videos, often featuring knockoffs of well-known children’s characters, could potentially traumatize young viewers.
Introduced back in 2015, the YouTube Kids app lets children browse channels and playlists in categories that include shows, music, and learning. The app was revamped last week with new features and improved security. When announcing the update, Google warned that while systems are in place to prevent certain material from making its way onto the app, some mature content could slip through because the videos aren’t manually reviewed. It advised parents who discover anything unsuitable for kids to block and report it.
Algorithms decide whether videos uploaded to YouTube are suitable for YouTube Kids. YouTube's global head of family and learning content, Malik Ducard, told the NYT that the approved videos are continually monitored, a process that is “multilayered and uses a lot of machine learning.”
But, as we’ve seen before, algorithms aren’t perfect. Some of the videos that got through included one of Mickey Mouse lying in a pool of blood after being hit by a car, as Minnie Mouse watches. There was also a claymation version of Spider-Man urinating on Princess Elsa from Frozen. There were plenty of bizarre rip-offs of popular kids’ shows, too, including the “PAW Patrol” characters in a strip club, as well as some pretty bewildering CGI clips.
According to YouTube, the problem isn’t widespread: in the last 30 days, less than 0.005 percent of videos viewed on the app were removed for being inappropriate, and the company says it is trying to reduce that figure further.
"We use a combination of machine learning, algorithms and community flagging to determine content in the app as well as which content runs ads. We agree this content is unacceptable and are committed to making the app better every day," said a YouTube spokesperson.
Whether the fault lies solely with the algorithms or with trolls purposely trying to trick the system, parents and children's groups are angry about the situation.
“Algorithms are not a substitute for human intervention, and when it comes to creating a safe environment for children, you need humans,” Josh Golin, executive director of the Campaign for a Commercial-Free Childhood, told the Times.