TikTok must face lawsuit over 10-year-old girl's death, US court rules.
A U.S. appeals court has revived a lawsuit against TikTok by the mother of a 10-year-old girl who died after taking part in a viral "blackout challenge" in which users of the social media platform were dared to choke themselves until they passed out.
While a federal law typically shields internet companies from lawsuits over content posted by users, the Philadelphia-based 3rd U.S. Circuit Court of Appeals on Tuesday ruled the law does not bar Nylah Anderson's mother from pursuing claims that TikTok's algorithm recommended the challenge to her daughter.
U.S. Circuit Judge Patty Shwartz, writing for the three-judge panel, said that Section 230 of the Communications Decency Act of 1996 only immunizes information provided by third parties and not recommendations TikTok itself made via an algorithm underlying its platform.
She acknowledged the holding was a departure from past rulings by her court and others, which had held that Section 230 immunizes an online platform from liability for failing to prevent users from transmitting harmful messages to others.
But she said that reasoning no longer held after a U.S. Supreme Court ruling in July on whether state laws designed to restrict the power of social media platforms to curb content they deem objectionable violate their free speech rights.
In those cases, the Supreme Court held a platform's algorithm reflects "editorial judgments" about "compiling the third-party speech it wants in the way it wants." Shwartz said under that logic, content curation using algorithms is speech by the company itself, which is not protected by Section 230.
"TikTok makes choices about the content recommended and promoted to specific users, and by doing so, is engaged in its own first-party speech," she wrote.
TikTok did not respond to requests for comment.
(Score: 5, Insightful) by Runaway1956 on Saturday August 31, @12:22PM (3 children)
I don't have TikTok, but I have a Facebook account. Person A doesn't nag me, personally, every time he/she makes a post. Facebook does that, not Person A. Nor does Application Z nag me with every update, instead Application Z relies on Facebook algorithms to nag me. Group Q is the same - the Group doesn't nag me with every update, instead relying on Facebook algorithms to do that. Person B doesn't search for me on Facebook, then recommend him/herself as a new friend. Again, Facebook does that.
Section 230 shouldn't protect any of that. Facebook is publishing all of it, and pushing it to the user. In the case of that scantily clad young female, we could probably make a case of procurement against Facebook. Seriously, why is Facebook constantly trying to introduce me to nearly naked young women, if not to promote prostitution? In most cases, the young lady and I have absolutely nothing in common - no mutual friends, no family relation, no apparent interests, nothing. We didn't go to the same schools, often don't live in the same state, don't go to any of the same places. Nothing to connect us, but there she is, offered up as a new "friend". Assuming she is an innocent young lady, and not a hooker, is her feed filled with old men like myself? That's a creepy idea!
A MAN Just Won a Gold Medal for Punching a Woman in the Face
(Score: 3, Insightful) by JoeMerchant on Saturday August 31, @02:28PM
>why is Facebook constantly trying to introduce me to nearly naked young women
Because you are constantly clicking on similar photos?
I agree: "Section 230 of the Communications Decency Act of 1996 only immunizes information provided by third parties and not recommendations"
When I click on "show me less posts like this" and, instead, the service [wikipedia.org] shows more, my only option seems to be to abandon the service altogether. Unfortunately, in the case of Facebook, they control the vast majority of content available on many topics (classic cars for sale in my part of the world, for one), and so I am held hostage to their algorithms, which seem intent on presenting as little as possible of what I am interested in, wedged in between self-serving stuff like friend proposals, junk I don't want, TikTok-modeled "Reels", etc.
The latest trap was a teaser story that, when "continue" was clicked, asked for Android permissions and spammed a bunch of McAfee links flashing full screen.
When the algorithms aren't capable of protecting the users from harm, I do believe the service has a responsibility to improve them, or face suits and liability for damages. Teenagers strangling themselves to death is a well-known hazard easily spread by information services, including word of mouth. Oprah touched on the issue in an attempt to help prevent it in the 1980s.
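To make concrete what "improving the algorithms" could mean at the feedback step, here is a minimal sketch of a feed ranker that actually honors a "show me fewer posts like this" click by down-weighting similar items. All the names, weights, and the topic-overlap heuristic are hypothetical illustrations, not any real platform's code.

    # Minimal sketch: a feed ranker that honors negative feedback.
    # All names, weights, and heuristics here are hypothetical, not
    # any real platform's algorithm.
    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class Post:
        post_id: int
        topics: frozenset          # e.g. frozenset({"classic cars", "for sale"})
        engagement_score: float    # platform's base relevance estimate

    @dataclass
    class UserModel:
        muted_topics: set = field(default_factory=set)

        def record_show_fewer(self, post: Post) -> None:
            """Called when the user clicks 'show me fewer posts like this'."""
            self.muted_topics |= post.topics

        def score(self, post: Post) -> float:
            # Down-weight by the fraction of the post's topics the user muted.
            overlap = len(post.topics & self.muted_topics) / max(len(post.topics), 1)
            return post.engagement_score * (1.0 - overlap)

    def rank_feed(posts: list, user: UserModel) -> list:
        """Highest score first; fully muted posts sink to the bottom."""
        return sorted(posts, key=user.score, reverse=True)

A fully muted post scores zero here, so honoring the signal is cheap to do; a service whose "show fewer" button instead produces more of the same is making an editorial choice, which is the point of the liability argument above.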
🌻🌻 [google.com]
(Score: 5, Interesting) by Mojibake Tengu on Saturday August 31, @03:00PM
You were selected as a target for compromise, so you are fed baits. It starts with innocent baits; at some breaking point you will receive a critical bait. If you fall for the critical bait, you will be criminalized or forced to comply by performing something.
This methodology was developed in the '70s and used regularly by agencies on both sides of the Iron Curtain. Often collaboratively.
Namely, the StB (the Czechoslovak internal security agency) called the method's initial stage "growing the mycelium" [podhoubí] and its forcing stage "the mushroom harvest" [sklizeň hříbků]. People without proper emotional self-control fall for baits easily.
The CIA received the exact escalation methodology directly from the KGB (the Yuri Bezmenov case).
Facebook is nothing but an electronic version of that. Remember, Twitter was originally created as a similar tool for Africa, at a time when there were no computers there, only phones capable of SMS (text) messaging, hence the original character limit.
So everyone out there sane enough, and your grandmother, understands these "algorithms" well.
Rust programming language offends both my Intelligence and my Spirit.
(Score: -1, Troll) by Anonymous Coward on Sunday September 01, @01:27AM
You're absolutely right. We must trash the CDA completely, with extreme prejudice. It is a direct violation of the 1st Amendment.
If you don't like Facebook's content, don't use it. It's that simple.
It's funny to see you stand up for the anti-vaxxers and Musk, et al, and then complain about Facebook and the libraries. Obviously your belief in free speech is very selective.
(Score: 5, Insightful) by HiThere on Saturday August 31, @01:28PM (10 children)
That sounds like the correct distinction to me. The recommendation *is* an action by the company, though the post is a post by the user. When the company exercises editorial judgement, the company is the one acting. (Of course, social media would be useless without editorial judgement, but that could be done by something like "community moderation" if the company didn't want the responsibility. Community moderation would imply that the company did not reimburse the moderators in ANY way, not even as much as an NFT of something that was public domain.)
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 4, Interesting) by JoeMerchant on Saturday August 31, @03:08PM (3 children)
I would agree: SN is pretty clearly "community moderated" whereas Facebook, X, Google and the other mega-services are clearly "shaping" the content presented to the users via closed proprietary algorithms.
Transparency is always the answer. Want protection under 230? Publish your algorithms and prove they are the only thing shaping the content selections presented to users, free of other editorial input.
Note: this in no way eliminates the possibility of suppression of hate speech, misinformation, pornography, or stupid cat videos, it just requires disclosure of the algorithms, and their controlling data.
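For illustration, a toy sketch of what such a disclosure could look like: a ranking function whose weights and features are published so anyone can audit what shapes their feed. The weights, feature names, and formula are all hypothetical; no real platform publishes anything like this today.

    # Toy sketch of a *disclosed* ranking function. The weights and the
    # formula are published so users can verify what shapes their feed.
    # Entirely hypothetical: not any real platform's algorithm.
    PUBLISHED_WEIGHTS = {
        "recency": 0.5,      # newer posts rank higher
        "followed": 0.3,     # posts from accounts the user follows
        "engagement": 0.2,   # likes on the post, capped
    }

    def disclosed_score(post: dict, user: dict) -> float:
        """Score = sum over features of (published weight * documented value)."""
        features = {
            "recency": 1.0 / (1.0 + post["age_hours"]),
            "followed": 1.0 if post["author"] in user["follows"] else 0.0,
            "engagement": min(post["likes"] / 1000.0, 1.0),
        }
        return sum(PUBLISHED_WEIGHTS[k] * v for k, v in features.items())

Under a scheme like this, claiming the 230 shield would mean demonstrating that nothing outside the published formula and its disclosed inputs influenced what was selected.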
Cue the "security via obscurity" strawmen:
🌻🌻 [google.com]
(Score: 0) by Anonymous Coward on Sunday September 01, @08:55AM (2 children)
Does the community really choose what stories are run though? It seems more like that's under the control of a few people.
Users can submit stuff but so can the "few people". And the few people get to choose what appears. Not the "community".
If that's community moderated, then the Chinese government is about as community moderated as SN, too.
(Score: 2) by janrinok on Sunday September 01, @10:50AM
I had to run my bot this morning because we were down to 3 submissions. We normally aim for twice as many subs as we require for a day (i.e. we need 10 in the list). This ensures that we can provide a variety of topics for discussion. If the community submitted more, more would get published. We prioritize submissions from the community and only revert to the bots' output when we do not have enough variety to fill the pages.
Your last submission under your username was in 2017. It was published. You might have submitted more since then as AC.
I am not interested in knowing who people are or where they live. My interest starts and stops at our servers.
(Score: 2) by owl on Sunday September 01, @03:37PM
The community submits items via the Submit Story [soylentnews.org] page (have you submitted anything?).
A group of editors (I don't know how many) decides when each submission runs (time of day) and occasionally does not let a few submissions through (presumably because the submissions are bad (spam, etc.) or not likely to be of interest to the community). I.e., they, somewhat, 'moderate' the incoming submissions.
And from janrinok's other post, in instances where the community has not submitted sufficient stories, one or more of the editors run bots to gather something to have in the queue.
So, if you want to see the editors do fewer submissions, then start doing more submissions yourself.
(Score: 4, Insightful) by owl on Saturday August 31, @07:25PM (5 children)
Except, that is not what section 230 says [techdirt.com]. The sites are free to moderate to whatever extent they see fit, provided the content being moderated was created by users and not the site.
The distinction here is that the "recommendations" are not "content created by the users," so the "recommendations" likely end up invoking liability (as the site creates the "recommendations"). The content inside the "recommended for you" links, that's still created by users, and the site has liability protection for it. The recommendation listing itself, that's content created by the site, and that's where this new case is tossing a grenade into the Section 230 shield.
(Score: 3, Interesting) by JoeMerchant on Saturday August 31, @09:47PM
I would find "recommending" user created content showing 50 creative ways to commit suicide worthy of liability in a case where a child saw the recommended content and then committed suicide.
Can't control your recommendation algorithms? Get out of the business, or prepare to pay for the actual damages they contribute to.
🌻🌻 [google.com]
(Score: 2) by HiThere on Sunday September 01, @03:28AM (3 children)
I'm no lawyer. You may be correct about what the law says, but if so the law is wrong, and needs to be changed.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by owl on Sunday September 01, @04:25AM (2 children)
Don't get the wrong idea, there is still liability, but the liability resides with the user who created the content, not the site.
What 230 does is prevent "ambulance chasing" lawyers from suing the party with deep pockets (read as: "hefty potential payout if lawsuit wins" or "likely to settle for a sizable sum just to make it go away") instead of suing the individual user who created the content (read as: "likely not enough assets to be worth the lawyer's time").
The liability rests with the one who created the content. So if @HiThere posted something on Facebook that "crossed the line", said ambulance chasing lawyer can't sue Facebook for hosting it, simply because Facebook's pockets are quite obviously deeper than @HiThere's pockets. That lawyer has to sue @HiThere, the creator of the content.
(Score: 2) by HiThere on Sunday September 01, @01:39PM (1 child)
That part is fair, but letting the party that selected "this message," out of all the others, for presentation off the hook is not fair. Yes, there's a "deep pockets" problem, but that shouldn't exonerate them.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by owl on Sunday September 01, @03:29PM
Which is exactly the grenade this court case has thrown into the Section 230 protections.
This case has now interpreted algorithmic created "recommended for you" lists as "created by the site" -- and so legal issues arising from those lists (assuming the court case stands) now confer liability onto the site.
So what this case has done is change the law (well, change the interpretation of what falls under the law) such that sites no longer get to escape liability for the "recommended" lists by arguing that it's just a listing of user-generated content and all the liability lies with the users who generated the individual bits of content. Suddenly, now, the site itself is on the hook for liability (secondary liability, but zero to X is an infinite percent change in risk, even for secondary liability) for having created the recommended lists. If this case stands, we very well might see the "recommended for you" listings disappearing everywhere, to remove that risk.
(Score: 3, Touché) by srobert on Saturday August 31, @03:07PM (5 children)
I'm sorry, I know the focus of this article was supposed to be on the media providers' responsibility for user content, but the aspect of this story that caught my attention was that there are people who will respond to a dare to choke themselves until they're unconscious. I can't help but wonder whether helping these people survive to the reproductive stage is advantageous to the future of our species.
(Score: 3, Insightful) by mrpg on Saturday August 31, @04:02PM (2 children)
She was 10 years old; at that age the brain is not fully developed, so she was not responsible for her actions. It's the same reason we don't give voting rights to minors.
(Score: 0) by Anonymous Coward on Saturday August 31, @07:03PM
We don't allow miners to vote? Hmmmm . . .
(Score: 3, Insightful) by Joe Desertrat on Monday September 02, @12:11AM
Why, then, were her parents not monitoring her internet usage? TikTok is known for hosting a lot of out-there content. Maybe if attention were paid to this sort of thing, instead of to seeing a boob or to banning books about lifestyles not condoned, this would not have happened.
(Score: 2) by JoeMerchant on Saturday August 31, @04:41PM
> I can't help but wonder whether helping these people survive to the reproductive stage is advantageous to the future of our species.
The CO2-buildup sensing system isn't always well tuned in the human brain (and others'). Sudden Infant Death Syndrome is frequently traceable to an infant placed face down with their head angled backwards. The back-turn of the head compromises circulation to the brain center that triggers the gasping reflex in response to CO2 buildup, and the face-down posture is not only called "prone" but is also prone to exhaled-CO2 buildup. The two combine to cause a significant number of infant deaths. Thus the "back to sleep" campaign, which urges new parents to place their infants on their backs for sleeping.
In the 1980s (and before) there was a pre-internet viral meme connecting self-asphyxiation to heightened pleasure during orgasm, whether with partner(s) or alone. The spread of this story, whether true or false, leads to a significant number of adolescent deaths in the "alone" scenario. Apparently, the promise of a better orgasm isn't the only thing that will drive children to kill themselves this way; TikTok found that a simple dare is enough, at least for this 10-year-old.
Meanwhile, adults are out there risking, and receiving, brain damage in the pursuit of world records: https://en.wikipedia.org/wiki/Static_apnea [wikipedia.org] The fame they receive probably increases their chances of successful mating...
🌻🌻 [google.com]
(Score: 2) by sjames on Sunday September 01, @08:20AM
10-year-olds have not fully developed their frontal cortex yet, and so are prone to not appropriately anticipating things that may go wrong. It's nothing that a few more years won't usually fix.
If it happens to someone in their mid 20s, I would be a bit more sympathetic to your view.
(Score: 4, Interesting) by Snotnose on Saturday August 31, @03:12PM (3 children)
Back in the 70s when I was in Jr High (age 12-14) one of my classmates took the blackout challenge, fell down, hit his head, and died. TBH it was a huge relief for me and my friends, this guy used to bully the hell out of us daily.
The secret to success is to never run from hard work. A brisk walk usually suffices.
(Score: 3, Funny) by srobert on Saturday August 31, @04:39PM
That could be the premise of an interesting movie. Imagine a film about a bunch of kids who don't feel the way adults think they should feel about the death of a classmate.
(Score: 0) by Anonymous Coward on Saturday August 31, @07:05PM
Darwin works in mysterious ways, eh?
(Score: 0) by Anonymous Coward on Monday September 02, @11:06AM
You know, if I was the suspicious type I might wonder about how many times he had to fall down and hit his head to die.
(Score: 2, Touché) by Rosco P. Coltrane on Saturday August 31, @04:31PM
Let's sue the studio!
(Score: 3, Insightful) by Gaaark on Saturday August 31, @07:52PM (1 child)
I thought they were banning TikTok 'cos ... CHINESE!!! Did I miss something?
Not really caring because I don't do FriendFace or TicTac or such.
--- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
(Score: 2) by owl on Saturday August 31, @09:18PM
Here are the details [archive.is] (archived NYT article).
This, however, is different: this is a court case accusing TikTok of some form of negligence. It is unrelated to "the ban" from the archived NYT article above.