YouTube announces new measures to fight terrorist content

midian182


Following Facebook’s explanation last week of how it is tackling terrorist content, YouTube has announced the steps it will take to address the same issue. In a post first published in the Financial Times before appearing online, Google senior VP and general counsel Kent Walker revealed four new measures being introduced to identify and remove terrorist or violent extremist material.

“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now,” writes Walker.

Like Facebook, Google pledges to increase the use of its technology to help identify terrorism-related content. While AI has been responsible for identifying 50 percent of the videos it removed in the past six months, Google will “devote more engineering resources to apply our most advanced machine learning research to train new 'content classifiers' to help us more quickly identify and remove” extremist material.

Google is also increasing the number of independent experts in YouTube’s Trusted Flagger program. It is adding 50 expert NGOs to the 63 organizations that already take part, and intends to support them with operational grants. Walker says that while flags submitted by ordinary users can be inaccurate, reports from Trusted Flagger members prove accurate over 90 percent of the time.

Google is also going to clamp down on videos that, while not clearly violating its policies, come close enough to warrant action. This type of content, which includes inflammatory religious or supremacist material, will be demonetized, placed behind an interstitial warning, and stripped of user comments and recommendations. All of these steps should make such videos harder to find and lower user engagement with them.

Finally, YouTube is expanding its “Redirect Method” across Europe. The system uses targeted ads to direct potential ISIS recruits toward anti-terrorist videos that will hopefully convince them not to join the organization.

According to Walker: “In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages.”

Google added that it is working with other tech firms, including Facebook, Microsoft, and Twitter, to share resources and technology in the fight against terrorism online.

“Extremists and terrorists seek to attack and erode not just our security, but also our values; the very things that make our societies open and free. We must not let them. Together, we can build lasting solutions that address the threats to our security and our freedoms.”


 
So long as this new logic blocks leftist extremist videos as it already does right-leaning ones. Most conservative content creators already deal with YouTube's phantom censorship of demonetization: some channels get their content demonetized where their leftist equivalents do not. YouTube, Facebook and Twitter are supposed to be open forums for people to use, but their controllers bend to political and financial pressure, or just choose what they will accept based on their own personal views.

On a side note... How can AI learn what is extremist when it's a matter of perspective? One man's terrorist is another man's freedom fighter. Or is it that when you plug in your own perspective as the baseline, anything that doesn't align is therefore extremist and gets blocked or demonetized? So much freedom for those within the baseline versus those on the fringe. Am I the only one who sees a problem with this?
 
Another problem YouTube has now is "drama" channels that do nothing but attack individuals without proof and even promote cyberbullying. It's become a cesspool, with entire communities formed around these channels that do nothing but go on Twitter or Facebook and post hate comments and the like.
Reddit was enough; we don't need YouTube to become just as bad.
 