YouTube looking at new ways of stopping misinformation from spreading online

midian182

Posts: 7,907   +82
Staff member
In context: YouTube’s 22.8 billion visits per month make it the world’s second most popular website, behind only Google.com, which is what makes it such an enticing service for those spreading misinformation. It’s something the Google-owned site has long tried to combat, and it is now making a renewed push to stop these narratives.

Neal Mohan, Chief Product Officer at YouTube, wrote an extensive post about tackling misinformation on the service. It focuses on three areas, the first of which is stopping these videos before they go viral. YouTube already classifies conspiracy theories, such as claims that 5G caused the spread of the coronavirus, as content that violates its guidelines, but some new narratives emerge too quickly to be caught by the company’s systems. As such, YouTube will “leverage an even more targeted mix of classifiers, keywords in additional languages, and information from regional analysts to identify narratives our main classifier doesn’t catch.”

YouTube's four Rs of Responsibility

The second area of concern is the sharing of misinformation across platforms. YouTube says it has lowered the number of recommendations it makes for “borderline” videos that don’t quite warrant removal, but these are often promoted on other sites via links and embeds. The company has considered removing the share button or disabling links for these videos, but worries that doing so would go too far and restrict viewers’ freedoms. It is also considering an interstitial warning that a clip may contain misinformation.

Finally, YouTube is looking to better tackle misinformation in languages other than English. It notes that what’s considered borderline content varies in each country. One option is to partner with non-governmental organizations to better understand regional and local misinformation.

As with all internet platforms, YouTube must walk a fine line between banning anything it considers harmful and overextending its reach to the point where it’s accused of censorship. "We need to be careful to balance limiting the spread of potentially harmful misinformation, while allowing space for discussion of and education about sensitive and controversial topics," Mohan said.


 

Dimitriid

Posts: 2,212   +4,254
It really isn't that difficult: you don't *need* to implement these complicated systems to remove sharing, remove embedding, post sources on the topics of videos, etc.

You just need a team of a few hundred people at most to go through the top 2 or 3% of the most popular conspiracy videos, extremist and Nazi content, and so on. Just a team of people going through the most popular videos and saying "Boom, video is gone, account is banned, no recourse" would absolutely KILL the algorithm for any other videos in those areas, because it really does chain up to the popular videos.

They just have to be willing to say "We don't want anti-vaxx money," for example. All of this added complication is just the "many sides, gotta listen to all perspectives" nonsense and other such centrist talking points, because, as I said, conspiracy theory content does make YouTube a lot of money and brings a lot of people onto the platform (even if those videos technically are not monetized).

So this is just Susan and her team dancing around, trying to appear as if they give a crap about these issues while not cutting off a major part of the audience, and of the advertising money, that their algorithm created unsupervised. A purge is not as difficult as they make it out to be; it just hurts financially.
 

winjer

Posts: 427   +1,952
It really isn't that difficult: you don't *need* to implement these complicated systems to remove sharing, remove embedding, post sources on the topics of videos, etc.

You just need a team of a few hundred people at most to go through the top 2 or 3% of the most popular conspiracy videos, extremist and Nazi content, and so on. Just a team of people going through the most popular videos and saying "Boom, video is gone, account is banned, no recourse" would absolutely KILL the algorithm for any other videos in those areas, because it really does chain up to the popular videos.

They just have to be willing to say "We don't want anti-vaxx money," for example. All of this added complication is just the "many sides, gotta listen to all perspectives" nonsense and other such centrist talking points, because, as I said, conspiracy theory content does make YouTube a lot of money and brings a lot of people onto the platform (even if those videos technically are not monetized).

So this is just Susan and her team dancing around, trying to appear as if they give a crap about these issues while not cutting off a major part of the audience, and of the advertising money, that their algorithm created unsupervised. A purge is not as difficult as they make it out to be; it just hurts financially.

True. But that would require some effort.
And that goes against their modus operandi.
 

Dimitriid

Posts: 2,212   +4,254
True. But that would require some effort.
And that goes against their modus operandi.

That's the thing: to me, both scenarios take effort. On the side they're on right now, there's actually significant effort: it's not going to be easy to code something that targets a specific video and disables embedding, and does this automatically for all non-whitelisted covid content, for example. It might not be super complicated, but it will take some coders more than a few evenings to implement, test, and deploy.

On the other hand, saying "We're ready to delete all these people and start a long political and probably legal battle, send our people to GOP-led congressional hearings, the works" also takes effort.

The issue is that, quite frankly, coders are cheaper to obtain than lawyers and PR people. And second, and this is the kicker, I truly believe Susan Wojcicki herself, and Alphabet as a whole, genuinely believe in letting anti-vaxxers, right-wing conspiracies, etc. live and thrive on their platform as a matter of free speech, because they truly believe anything that makes them money should be fair game. They are likely annoyed that they have to give any credit to the reasonable people who want to curb misinformation, when fundamentally they would like to profit from it: a CEO probably truly believes in the "free marketplace of ideas."
 

Dimitriid

Posts: 2,212   +4,254
LOL who gives YouTube the right to determine if the information is disinformation or not?
YouTube does: they're a private entity, and they technically don't have to answer to any moderation or regulation, because, unlike traditional TV and radio broadcasting, nobody bothered to think about regulating what's actually said on the internet and trusted that the free market would take care of it on its own.
 

QuantumPhysics

Posts: 6,308   +7,247
YouTube does: they're a private entity, and they technically don't have to answer to any moderation or regulation, because, unlike traditional TV and radio broadcasting, nobody bothered to think about regulating what's actually said on the internet and trusted that the free market would take care of it on its own.


The free market will ultimately take care of it. People who don't like Facebook, YouTube, Twitter, or IG should stop using them. Let them lose so many users and so much money that they are either forced to reconsider their policies or watch themselves burn.

I reject any intervention in the free market of US social media companies, except when they themselves break US law.
 

psycros

Posts: 4,156   +5,803
That's the thing: to me, both scenarios take effort. On the side they're on right now, there's actually significant effort: it's not going to be easy to code something that targets a specific video and disables embedding, and does this automatically for all non-whitelisted covid content, for example. It might not be super complicated, but it will take some coders more than a few evenings to implement, test, and deploy.

On the other hand, saying "We're ready to delete all these people and start a long political and probably legal battle, send our people to GOP-led congressional hearings, the works" also takes effort.

The issue is that, quite frankly, coders are cheaper to obtain than lawyers and PR people. And second, and this is the kicker, I truly believe Susan Wojcicki herself, and Alphabet as a whole, genuinely believe in letting anti-vaxxers, right-wing conspiracies, etc. live and thrive on their platform as a matter of free speech, because they truly believe anything that makes them money should be fair game. They are likely annoyed that they have to give any credit to the reasonable people who want to curb misinformation, when fundamentally they would like to profit from it: a CEO probably truly believes in the "free marketplace of ideas."

Could you be any more transparently biased? After showing your partisanship and desire to purge all opinion you dislike, nothing else you say has any relevance.
 

terzaerian

Posts: 1,265   +1,771
Could you be any more transparently biased? After showing your partisanship and desire to purge all opinion you dislike, nothing else you say has any relevance.
When the systems and principles that keep people like this in a position to pontificate against them do finally and irrevocably fall apart, they're not going to like what comes after.
 

Dimitriid

Posts: 2,212   +4,254
You do not need YouTube to tell you that. Use your common sense.
You can't talk about common sense when there isn't even a common language: even what you think of as common sense (which isn't as common as you think it is) really only applies, in this case, to native English speakers.

YouTube can be viewed all over the world, often with poorly constructed auto-translation features, so somebody who legitimately doesn't know how to translate "bleach" into their own language, and just sees a bottle of an American medicine to combat covid, could end up drinking bleach through no fault of their own.

Every time you post stuff like this, you only show the limits of this self-regulation, or of no regulation at all, trusting in "common sense" even at face value. The fact that it took years to take down Alex Jones talking about literal vampire pot-bellied goblins, while he used that to run an extremely successful business selling literal brain pills, shows that no, common sense isn't enough.
 

hahahanoobs

Posts: 4,461   +2,430
Silence them all, I don't care. Better than the noise I'm hearing now.

I wouldn't hang out with a single one of these "freedom fighters." They can barely speak in full sentences. They definitely aren't all that educated, unless you call 500 hours on social media an education. These people just need better friends.
 

shark975

Posts: 80   +97
Of course, we know misinformation = conservative political views.

I saw somewhere that Facebook and IG have started fact-checking/censoring posts about INFLATION now. Not covid or something they can at least claim is a public health issue; nope, they're directly acting as a political arm of the Democratic party now.
 

Dunkerton

Posts: 59   +114
YouTube does: they're a private entity, and they technically don't have to answer to any moderation or regulation, because, unlike traditional TV and radio broadcasting, nobody bothered to think about regulating what's actually said on the internet and trusted that the free market would take care of it on its own.
lol imagine defending big tech
 

Tantor

Posts: 315   +580
It really isn't that difficult: you don't *need* to implement these complicated systems to remove sharing, remove embedding, post sources on the topics of videos, etc.

You just need a team of a few hundred people at most to go through the top 2 or 3% of the most popular conspiracy videos, extremist and Nazi content, and so on. Just a team of people going through the most popular videos and saying "Boom, video is gone, account is banned, no recourse" would absolutely KILL the algorithm for any other videos in those areas, because it really does chain up to the popular videos.

They just have to be willing to say "We don't want anti-vaxx money," for example. All of this added complication is just the "many sides, gotta listen to all perspectives" nonsense and other such centrist talking points, because, as I said, conspiracy theory content does make YouTube a lot of money and brings a lot of people onto the platform (even if those videos technically are not monetized).

So this is just Susan and her team dancing around, trying to appear as if they give a crap about these issues while not cutting off a major part of the audience, and of the advertising money, that their algorithm created unsupervised. A purge is not as difficult as they make it out to be; it just hurts financially.


Very true, YouTube could easily purge everything they don't want. Their problem is how to censor without angering the broader public and losing income. Everyone knows that leftist 'woke' media like YouTube are full-on conspiring to shut down traditional Americans.

The famous economist Adam Smith once wrote in The Wealth of Nations: "People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices."

Here's the truth:

1) Conspiracies are the norm, not the exception. Any group that actively hides its efforts against the public is engaging in conspiracy. Conspiracies are everywhere. The Holocaust was a conspiracy of top-level German officials. 9/11 was a conspiracy of Middle Eastern leaders.

2) Theorizing about conspiracies is exactly what intelligence agencies do; it's their JOB, for heaven's sake. The CIA, the NSA, military intelligence: all of them spend enormous amounts of time
- theorizing about conspiracies,
- theorizing about ways to protect against those conspiracies,
- conspiring to perform their own covert actions.

3) Governments actively conspire against their own people. This naturally leads to people actively theorizing about government conspiracies.

4) Regarding YouTube: YouTube is not a private agent. They operate on the public internet, which was created with public money to SERVE PUBLIC INTERESTS. They're using infrastructure that effectively belongs to the public. If they want to censor people, then they should get off the internet. YouTube is like a big bus that claims it's public transport but actively discriminates against half the population. It's far worse than the Jim Crow era, which put blacks in the back of the bus. With YouTube, pro-white advocates aren't even allowed on the bus.

5) All of this is natural. Nature is full of deception, hidden messages, and covert actions.