YouTube looking at new ways of stopping misinformation from spreading online

YouTube gets to decide what is "disinformation", i.e. what doesn't fit the Democrat/leftist narrative of LIES. MSM and big tech are working to slowly destroy America from within by promoting communistic narratives and denigrating national pride. Why? Because the basic principles of the Democrat party are controlled by the rich and famous, and it's their best shot at keeping their money and power.
 
& of course Google/YouTube gets to decide what the right stance on everything is; seemingly Alphabet, the group that owns Google & YouTube, scrapped their old motto "don't be evil". I'm not a right-winger, I just don't like censoring; that stuff should stay in North Korea & China.
 
The most destructive weapons on earth aren't nuclear. The ability to control the flow of information to the masses is what poses the greatest threat to humanity. The left-right paradigm is a snare for all who refuse to think for themselves. A snare to hold them in place, in line, at odds with one another, and oblivious to the true enemies preying on us all.
 
The more YouTube does this, the more they push conversation to other platforms, and the more ammunition they give their enemies, who are well aware of the leftist bias of YouTube. Just how far will Google be able to push this, I wonder?
It really isn't that difficult: you don't *need* to implement these complicated systems to remove sharing, remove embedding, post sources on the topic of videos, etc.

You just need a team of a few hundred people at most to go through the top 2 or 3% of the most popular conspiracy videos, extremist and nazi stuff, etc. Just a team of people going through the most popular videos and going "Boom, video is gone, account is banned, no recourse" would absolutely KILL the algorithm for any other videos in those areas, because it really does chain up to the popular videos.
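As a rough sketch of what that human review queue could look like, purely as an illustration (the flagged-video list, the view counts, and the 2-3% cutoff below are hypothetical, not anything YouTube has published):

```python
# Minimal sketch, assuming a hypothetical list of flagged videos with view counts.
def review_queue(flagged_videos, top_fraction=0.03):
    """Return the most-viewed slice of flagged videos for a human review team."""
    ranked = sorted(flagged_videos, key=lambda v: v["views"], reverse=True)
    cutoff = max(1, int(len(ranked) * top_fraction))
    return ranked[:cutoff]

# Example: with only two flagged videos, just the most-viewed one makes the cut.
print(review_queue([{"id": "a", "views": 9_000_000}, {"id": "b", "views": 1_200}]))
```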

They just gotta want to say "We don't want anti-vaxx money", for example. All of this other complication is just this nonsense of "Many sides, gotta listen to all perspectives" and many other such centrist talking points, because as I said, conspiracy theory content does make YouTube a lot of money and brings a lot of people to the platform (even if technically those videos are not monetized).

So this is just Susan and her team trying to dance around how to appear as if they give a crap about these issues while not cutting off a major part of the audience and of the advertising money their algorithm created unsupervised. A purge is not as difficult as they make it out to be, it just hurts financially.
"censorship is OK when it's my team" personified.
YouTube does: they're a private entity and they don't technically have to answer to any moderation or regulation, because unlike traditional TV and radio broadcasting, nobody bothered to think about regulating what's actually said on the internet and trusted the free market would take care of it on its own.
You need to study not only what the 1st Amendment is, but also Section 230, if you think there is *no regulation*. If these companies want to act like publishers and choose what is posted on their platform, they should give up their safe harbor rights.

Oh, and before you go "muh private corporation": it was legal, until 1965, for "private corporations" to deny business, service, or employment to people based on skin color. Saying that private businesses can discriminate against speech because "they are private" is one hell of a slippery slope, one I invite if it results in an internet bill of rights. A single tweak to Section 230 would remove the right of YouTube to do ANY of this BS, and sooner or later politicians are going to go after it.
 
They should either be a platform and let people be responsible for their own videos or they can be a publisher and be responsible for the content. They want to be both a platform and a publisher at the same time and that doesn't really make sense.
 
All they have to do is give viewers the ability to right-click on a video to enable an option that will prevent them from receiving suggestions from the specific channel hosting that video in the future. Also, for the top 10,000 channels that collect the most deactivations from viewers, there should be a short text warning in the style of "possibly misleading content".
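As a sketch of how that tally could work, purely as an illustration of the commenter's idea (the channel IDs, the counter, and the 10,000 cutoff are placeholders, not any real YouTube mechanism):

```python
from collections import Counter

# Hypothetical tally: channel_id -> number of viewers who opted out of its suggestions.
deactivations = Counter()

def record_deactivation(channel_id):
    """Called when a viewer uses the right-click option to stop suggestions from a channel."""
    deactivations[channel_id] += 1

def channels_needing_warning(top_n=10_000):
    """Channels with the most opt-outs would carry a "possibly misleading content" label."""
    return {channel for channel, _ in deactivations.most_common(top_n)}
```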

Anything else on a higher and/or more general level is censorship. The platform should be as transparent as possible.

As a viewer, I like science fiction; why should the platform decide for me what fiction is good and what is bad? I can decide for myself.

They are good at distinguishing what advertisers want but not so good at distinguishing what viewers want. They think that nobody else has the resources to implement a platform at that level, so they assume viewers have no other options, and that's why they don't care too much about that aspect.
 
Too late, Google. Your "anti-misinformation" program is just a way to control what counts as THE MISINFORMATION, and not multiple variations of it...

a$$h0le tech thinking they can drive the narrative.

There is no spoon
 
In the early days, when YouTube was a self-standing organization, things were pretty simple and direct, with little if any controversy. Now that it's become big business, the sky's the limit. The FCC needs to take a more direct approach: remove ALL limits on legal actions against IT companies and let the chips fall where they may.
 
I wasn't going to comment - but couldn't resist - how are all you guys going to get jobs with such chips on your shoulders?
Woke employers don't want to employ White People With Attitude

Straight out of Laredo (apparently 95%-plus white - probably extras farm hands) - poco loco considering it's on the frontera con Mexico.

I think you guys get it - but black people (mainly males) have been told so many times to drop the attitude if they want to work - Yes Boss, whatever you say Boss - Don't know about that Boss, just do my job with a smile Boss... Sí, Jefe.
 
 