In brief: YouTube may have implemented a number of measures to tackle toxic and conspiracy videos in recent times, but it hasn't always been in such a hurry to act. According to a new report, the company ignored employees' warnings about this type of content because it wanted to boost viewer engagement.

Bloomberg spoke to 20 current and former YouTube workers, who said staff had offered suggestions over the last few years on what to do about "false, incendiary and toxic" videos. But executives didn't want to stop recommending such videos or tracking their popularity, fearing that doing so would hurt engagement numbers.

One idea was to stop recommending videos that were troubling but didn't violate the site's hate-speech policies. The company rejected the plan in 2016 and only implemented something similar in January of this year.

Lawyers advised YouTube employees who weren't on the moderation teams not to go looking for toxic videos; if there was no proof that staff knew the content existed, the company would be less liable for it.

One person said that before Google acquired YouTube in 2006, the site was quick to address problematic videos, but that changed following the takeover, and the company started putting profit above all else.

Things became so bad that five senior workers left the company over YouTube's "inability to tame extreme, disturbing videos." One former employee said CEO Susan Wojcicki never dealt with the content and only focused on running the company. Some called the problem "bad virality."

A YouTube spokesperson said the company started making changes to its recommendation algorithm in 2016 and demonetized channels that hosted toxic videos a year later.

While disturbing videos remain on the site, YouTube is doing more to fight back, removing 8.8 million channels for violating its guidelines in Q4 2018 alone.