What just happened? The coronavirus (COVID-19) pandemic is hitting the tech industry pretty hard, but those who can work from home have been mostly spared from its worst effects. Unfortunately, it seems YouTube creators might not be in that group for much longer: Google announced today that it will be relying more heavily on its automated content moderation systems until COVID-19 fizzles out.

That may not seem like much of a problem, but YouTubers have been struggling to cope with the platform's often-unpredictable AI moderation systems for years. These systems will, at times, demonetize, block, age-restrict, or otherwise hinder the success of videos for seemingly arbitrary reasons. As you can imagine, that's a less-than-ideal situation for those who rely on the site to earn a living.

For better or worse, those systems will only become more prevalent over the coming months. In a series of tweets and a blog post, Google claims that because there are now fewer human reviewers available to look over content and appeals, it has no choice but to rely on AI-based alternatives.

Surprisingly, Google acknowledges that this will inevitably lead to more unfair video removals. "More videos will be removed than normal during this time, including content that does not violate our Community Guidelines," YouTube states, bluntly. "We know this will be hard for all of you."

It's unclear why YouTube, of all companies, doesn't have employees who can review content from home, but perhaps the platform's moderation tools require in-office access for security reasons.

Regardless, creators aren't completely out of luck here. Until such a time as human moderators are able to take over again, YouTube will not be issuing community strikes (enough of which can lead to channel termination) based on videos taken down by the AI, unless there's "high confidence" that the video is "violative."