Google lawyer says internet could become a "horror show" without Section 230

Daniel Sims

Why it matters: A 1996 US law that underpins much of how the internet works today faced its latest challenge in the Supreme Court this week. Lawmakers and big tech executives have wrestled with the idea of repealing or rewriting it, and a lawyer representing Google has now issued the latest strong warning on the topic.

The US Supreme Court heard oral arguments Tuesday on a case that could have major ramifications for how the internet works. A Google lawyer warned that if the company loses the case, the internet could become very different and much worse off.

The case, Gonzalez v. Google, is set to determine whether YouTube violates the federal Anti-Terrorism Act when its algorithm recommends videos from the terrorist group ISIS to users, amplifying the group's message. It was brought by the family of Nohemi Gonzalez, an American student killed in a 2015 ISIS attack in Paris.

At the heart of the issue is Section 230 of the 1996 Communications Decency Act, which says internet platforms aren't liable for the content their users publish. The legislation has come under intense scrutiny over the last few years, as many say it allows hateful and defamatory content to spread through social media.

Lawmakers have pushed to repeal Section 230, while tech leaders like Mark Zuckerberg would prefer to see it rewritten or amended. Defenders of the legislation argue that it's impossible for platforms to effectively monitor every piece of user-created content that someone might consider offensive or dangerous.

Google's attorney Lisa Blatt claimed the internet would become unrecognizable without Section 230 because platforms would be forced to either tightly filter all published content or filter nothing at all, turning the internet into a "horror show." The law protects sites from lawsuits over both publishing and removing content.

Blatt admitted that big companies like Google could weather such a drastic shift but warned that smaller sites couldn't. She also said the early internet would never have succeeded if every platform could have been sued for user-generated content.

However, Gonzalez family attorney Eric Schnapper claims this case doesn't fall under Section 230 because it concerns the YouTube algorithm that recommends ISIS videos, not the users who post them. Despite that distinction, algorithms are integral to how many of today's platforms operate.

The court justices acknowledged the importance of the case but also admitted their confusion regarding the subject. Justice Elena Kagan said the nine justices aren't the top nine experts on the internet. Justice Brett Kavanaugh suggested that Congress, which wrote Section 230, should be the body that changes or repeals it. It's possible that an opinion on the law might not emerge from this case.


 
Considering big tech has been hiding behind Section 230 as both platform and publisher while pretending to be only the former (and showing blatant bias for one side of politics over the other), it is amusing to see people's hyperbole over this...

Maybe this will finally be the catalyst to fix 230 (without much big tech lobbying)...
 
In my view, the internet is already unusable; it makes me want to stick with RSS permanently.

Pity that TechSpot doesn't support RSS properly. It only provides an intro note for articles, so you have to open the website to read the article itself. That's not fun.
 

No ad revenue or click tracking on RSS; I don't think it's a technical issue at all.
 
The only horror show will be for big tech and its irresponsible behavior over the years. When Section 230 was passed, it should have had a limited lifespan, which would have gone a long way toward resolving the problem painlessly. At this point, with all the harm they have allowed, they need to pay for their abuses without any excuses... period.
 
It would be a horror show for their profits because they would have to hire people to screen content before it's posted. Begin sarcasm: it is clearly outrageous for a big tech company to not make a trillion dollars a year for hosting a website filled with other people's content. End sarcasm.
 
I think it's pretty clear it needs to be rewritten. They want to be publishers and moderate content, but they don't want to be responsible for the content they're curating and amplifying?
Realistically, a platform can't possibly catch everything that's *illegal* (no, not merely annoying/fee-fee hurting) automatically anyway. We need a law with a few specific sections that clearly delineate what they can and cannot do.
 
"whenever its algorithm recommends its users videos from the terrorist group ISIS, broadening the group's message."

YouTube didn't recommend ISIS-promoting videos before terrorists went looking for that type of content first. It's not as if a person won't be able to find something to reinforce their alternate view of the world. YouTube didn't turn those people into terrorists; they were already there when they started looking for videos. There is no way anyone could review the content of every video on YouTube, and there's no way, at this time, that videos promoting terrible things can be detected automatically.
 
I GOT IT! Use AI to 'effectively monitor every piece of content users create'.
This also could effectively replace Big Brother.
(sorry, been a rough few days)
 
That is one of the plans for AI, of course.

AI is going to be the boot of our dear brother and sister. You have 'think of the military' from brother and 'think of the children' from sister.

We live in a world where incinerating plastics can be called 'advanced recycling.' Don't expect truth to be at the fore but expect the AI to be behind all of the curtains.

Or, as Bing might say, 'Some results have been removed.'
 