Lawsuit alleges harmful content on TikTok contributed to deaths of two teens

midian182

What just happened? The impact that social media platforms have on young users' mental health is once again under scrutiny after the families of seven French teenage girls filed a lawsuit against TikTok. They allege that the platform exposed their children to harmful content that led two of them to take their own lives at 15.

Filed in the Créteil judicial court, the lawsuit claims that TikTok's algorithm suggested videos to the teens that promoted suicide, self-harm, and eating disorders.

"The parents want TikTok's legal liability to be recognised in court," lawyer Laure Boutron-Marmion told broadcaster franceinfo. "This is a commercial company offering a product to consumers who are, in addition, minors. They must, therefore, answer for the product's shortcomings."

In September 2023, the family of 15-year-old Marie filed criminal charges against TikTok after her death, accusing the platform of "inciting suicide," "failure to assist a person in danger," and "promoting and advertising methods for self-harm," writes Politico. TikTok's algorithm allegedly trapped Marie in a bubble of toxic content linked to bullying she experienced because of her weight.

TikTok is facing numerous lawsuits in the US over claims that it is harmful to young people's mental health. In 2022, the families of several children who died while trying to participate in a dangerous TikTok challenge sued the company and its parent, ByteDance, after the app allegedly recommended videos of the 'blackout' strangulation challenge to the minors, all of whom were ten years old or under.

Last month, a group of 14 state attorneys general filed lawsuits against TikTok, accusing it of harming children's mental health and violating consumer protection laws. It's alleged that TikTok uses manipulative features to keep young users on the platform for longer. These include endless scrolling, autoplay videos, and frequent push notifications.

It's not just TikTok that remains under the spotlight over the alleged harms it can cause young people. All social media platforms face the same scrutiny. In October last year, the attorneys general of more than 40 US states sued Meta, alleging that its platforms harm children's mental health.

In a Senate online child safety hearing in January, Meta CEO Mark Zuckerberg apologized to parents in the audience who said Instagram contributed to their children's suicides or exploitation.

The impact of social media on the mental health of not just children but also adults led the US Surgeon General to call on Congress to require cigarette-style warning labels on these sites and apps, alerting users to the potential harms they can cause.

Social media companies usually hide behind Section 230 of the 1996 Communications Decency Act, which shields them from liability for user-posted content.

TikTok still faces a potential ban in the US. Due to national security concerns over its Chinese ownership, President Joe Biden signed legislation in April requiring ByteDance to divest its US operations by January 19, 2025, or face a nationwide ban.

Unless TikTok created the content, blame the people who did instead of the platform.

"TikTok's algorithm allegedly trapped Marie in a bubble of toxic content linked to bullying she experienced because of her weight." The trouble goes away the moment you close the app. Just close the app. I avoid toxic people on social media; it's so easy to block them and their content.
 
TikTok makes money off of the toxic content, so it is only fair that it bear some financial responsibility for it.

Bad content is the hidden cost of “free” user content. If they never have to pay that cost, they are not competing fairly in the market with companies that create their own content or police the “free” user content better.
 
I can agree with both sides on this. I don't think TikTok is 100% responsible, but it certainly played a role, and profiting off of toxic content does not sit well with me. I don't think TikTok should exist, but it does. I also think it's important, like the guy above you said, to close the app. To the undeveloped brain, especially one suffering from the brain rot of social media shorts, this is not easy.

The platform is designed to be addictive in the pursuit of profit, so it is, by association, predatory.

I don't think there is an easy answer here. Just a whole bunch of shades of gray that make interpreting the situation difficult.
 
No. TikTok -- and all other social media platforms -- make money off popular, desirable content: they present a user with what a user wishes to see. If that user wishes to see large amounts of toxic material, that's a psychological issue with them, and no fault of the platform.