Many people head to YouTube to catch up on the latest videos from their favorite channels, but the site tries to hang on to viewers through its recommended videos section. These recommendations drive 70 percent of YouTube's viewing time, yet they often contain extreme or misleading content, even when users haven't shown an interest in such material.
With 1.5 billion users, YouTube's audience exceeds the number of households worldwide that own televisions. The Wall Street Journal reports that almost three-quarters of the videos viewers watch are served up by an algorithm, which company engineers describe as one of the "largest scale and most sophisticated industrial recommendation systems in existence."
But these recommendations often lead to some of YouTube's more controversial channels, including those that feature conspiracy theories, partisan viewpoints, and misleading videos. Such clips frequently appear even when users have no history of watching similar content.
YouTube owner Google has tried to tackle the problem by adding more human moderators, removing flagged videos, and demonetizing the channels that create them, but the algorithm keeps surfacing clips that already draw a high volume of viewers and are likely to keep people on the site. According to engineers, these are often sensationalist and border on the extreme.
"YouTube is something that looks like reality, but it is distorted to make you spend more time online," former Google worker Guillaume Chaslot told the Guardian. "The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy."
YouTube altered its algorithm last October when searches on the Las Vegas shooting brought up videos claiming it was a government conspiracy. Those looking for breaking news stories have since been presented with "more authoritative" news sources, according to executives. Last week, the company said it was planning changes that would see all state-sponsored content labeled. Additionally, it is considering surfacing relevant videos from credible news sources alongside conspiracy theory videos.
The WSJ writes that recent tests show "news searches in YouTube return fewer videos from highly partisan channels," but many searches still surface conspiracy videos.