YouTube is bringing on more human moderators to review its videos
Humans will attempt to reduce wrongly flagged videos
By Greg Synek
YouTube has had problems with its video flagging system over the past few months that it just can't seem to iron out. From demonetizing videos for unclear reasons and displaying inappropriate content to kids, to leaving obviously policy-violating videos up for hours at a time, there is significant room for improvement.
Despite actions taken to help fight terrorism on the platform, YouTube is still struggling to revise its automated system for regulating uploaded content. To get ahead of the problems plaguing the video giant, YouTube plans to increase its number of human content reviewers to over 10,000 employees in 2018.
Next year, human reviewers will help remove hundreds of thousands of videos flagged by the community and by software. As of today, 98% of YouTube videos taken down for extreme violence are detected by machine learning algorithms. Approximately half of all videos with extremist content are taken down within two hours of being uploaded, and up to 70% are handled within eight hours. Unfortunately, that is still enough time to influence a large audience and pose a danger to the public.
In addition to hiring more human reviewers, YouTube will consult with industry groups and academic institutions on trending topics to evaluate emerging risks.
Over the next year, YouTube hopes to greatly reduce the potential for abuse on its platform and has committed to being more open about the content that is flagged. It will publish regular reports on actions taken against videos that violate its content standards.