YouTube's 'recommended videos' algorithm keeps surfacing controversial content

By midian182
Feb 8, 2018
Many people head to YouTube to catch up on the latest videos from their favorite channels, but the site tries to hang on to viewers through its recommended videos section. These personalized playlists account for 70 percent of YouTube's viewing time, yet they often contain extreme or misleading content, even when users haven't shown an interest in such material.

With 1.5 billion users, YouTube's audience is larger than the number of households worldwide that own televisions. The Wall Street Journal reports that almost three-quarters of the videos viewers watch are selected by an algorithm, one that company engineers describe as among the "largest scale and most sophisticated industrial recommendation systems in existence."

But these recommendations often lead to some of YouTube's more controversial channels, including those featuring conspiracy theories, partisan viewpoints, and misleading videos. They can surface even for users with no history of watching similar content.

YouTube owner Google has tried to tackle the problem by adding more human moderators, removing flagged videos, and de-monetizing the channels that create them, but the algorithm keeps surfacing clips that are already drawing a high volume of viewers and are likely to keep people on the site. These are often sensationalist and border on the extreme, according to engineers.

"YouTube is something that looks like reality, but it is distorted to make you spend more time online," former Google worker Guillaume Chaslot told the Guardian. "The recommendation algorithm is not optimizing for what is truthful, or balanced, or healthy for democracy."

YouTube altered its algorithm last October after searches on the Las Vegas shooting brought up videos claiming it was a government conspiracy. Those looking for breaking news stories have since been presented with "more authoritative" news sources, according to executives. Last week, the company said it was planning changes that would see all state-sponsored content labeled. Additionally, it is considering surfacing relevant videos from credible news sources alongside conspiracy theory videos.

The WSJ writes that recent tests show "news searches in YouTube return fewer videos from highly partisan channels," but many searches still return conspiracy videos.

Comment from OutlawCecil:

This article and video are jokes. You watch an official Trump video and you get suggested videos of the opposite? MAYBE because most people still firmly disapprove of Trump and there are simply more videos AGAINST Trump than for him. I doubt the algorithm is looking for much beyond "Trump". It's not like it's trying to figure out whether you support him or not. It's just Trump videos, most of which are against him, simple as that.
     
