The rise of the social media generation has allowed us to instantly share photos, videos, and our opinions (unpopular or not) whenever we want. Unfortunately, social media platforms such as Facebook, Twitter, and YouTube have also become conduits through which extremist ideologies, including those espoused by ISIS, flourish and are used as recruitment tools. Governments have criticized the most influential platforms for not doing enough to curb extremism, despite the censorship and free speech concerns such moderation raises.

Last June, it was reported that Twitter may allow certain tweets to be flagged as "fake news," and Facebook has implemented several ways to fight fake news on its platform. Now YouTube is trying its hand at dispelling extremist and terrorist views. Today the company announced that it is partnering with Jigsaw, another Alphabet subsidiary, to deploy Jigsaw's "Redirect Method": searches for certain English-language keywords will surface playlists of videos that debunk "violent extremist recruiting narratives" from ISIS and other terrorist groups.

According to the Redirect Method website, "It focuses on the slice of ISIS' audience that is most susceptible to its messaging, and redirects them towards curated YouTube videos debunking ISIS recruiting themes. This open methodology was developed from interviews with ISIS defectors, respects users' privacy and can be deployed to tackle other types of violent recruiting discourses online."
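Conceptually, the redirect step amounts to matching search queries against a curated keyword list and, on a hit, surfacing a counter-narrative playlist alongside or instead of ordinary results. The sketch below is purely illustrative; the keywords and playlist ID are hypothetical placeholders, and neither YouTube nor Jigsaw has published its actual matching logic.

```python
# Illustrative sketch only: a toy keyword-to-playlist redirect lookup.
# The trigger phrases and playlist ID below are hypothetical placeholders,
# not YouTube's or Jigsaw's actual data.
from typing import Optional

TRIGGER_KEYWORDS = {
    "example recruiting phrase",
    "example propaganda slogan",
}

COUNTER_NARRATIVE_PLAYLIST = "PL_counter_narratives_placeholder"


def redirect_playlist(search_query: str) -> Optional[str]:
    """Return a counter-narrative playlist ID if the query matches a trigger keyword."""
    query = search_query.lower()
    if any(keyword in query for keyword in TRIGGER_KEYWORDS):
        return COUNTER_NARRATIVE_PLAYLIST
    return None  # No match: fall through to normal search results.
```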

It appears to be working: a pilot program for the Redirect Method led to 320,000 individuals viewing "over half a million minutes of the 116 videos we selected to refute ISIS's recruiting themes."

YouTube is working closely with non-governmental organizations (NGOs) to create additional video content that could influence those most susceptible to extremist ideology. YouTube will also add other languages and expand the program into Europe, while using machine learning to dynamically update the list of "trigger" keywords that activate the anti-terrorist playlists.
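The announcement does not describe how the machine learning piece works, but one plausible approach (an assumption on my part, not YouTube's documented method) is to score incoming search queries for similarity to a seed set of known trigger keywords and promote close matches onto the list for human review. The sketch below uses character n-gram TF-IDF similarity from scikit-learn; the seed phrases and threshold are hypothetical.

```python
# Illustrative sketch: one way a trigger-keyword list could be expanded
# automatically. This is a guess at a generic technique; the article does
# not say how YouTube's machine learning actually works.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

SEED_KEYWORDS = ["example recruiting phrase", "example propaganda slogan"]  # hypothetical seeds


def expand_keywords(candidate_queries: list[str], threshold: float = 0.4) -> list[str]:
    """Flag candidate search queries that resemble known trigger keywords."""
    vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))
    matrix = vectorizer.fit_transform(SEED_KEYWORDS + candidate_queries)
    seeds = matrix[: len(SEED_KEYWORDS)]
    candidates = matrix[len(SEED_KEYWORDS):]
    # Each candidate's best similarity score against any seed keyword.
    scores = cosine_similarity(candidates, seeds).max(axis=1)
    return [q for q, s in zip(candidate_queries, scores) if s >= threshold]
```

In practice, any candidates surfaced this way would presumably still need human vetting before they trigger counter-narrative playlists, given the free speech concerns noted above.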

YouTube previously announced additional steps to combat extremist content, including increasing the number of independent experts in its "Trusted Flagger" program and taking a tougher stance on videos that violate YouTube's content policies. While the Redirect Method is focused on Islamic extremism, this approach could presumably counter other extremist hate groups and ideologies.

A larger issue still looms: what actually defines "extremist ideology"? What one person considers extreme may be perfectly acceptable to another. Who gets to decide what is extreme? Could this lead to widespread censorship on every major social media platform? Regardless, the sheer magnitude of combating extremism will keep Facebook, Twitter, YouTube, and others busy for quite some time.