TikTok Found Not Liable in Death of Girl in Strangulation Challenge

October 28, 2022

Federal communications law shields the video-sharing site TikTok Inc. from responsibility for the death of a 10-year-old girl who attempted a challenge, circulated on TikTok, that encourages viewers to strangle themselves with household items.

A federal judge in Pennsylvania found that Section 230 of the federal Communications Decency Act compelled him to dismiss the claims brought by Nylah Anderson’s mother against TikTok because the law grants broad immunity to sites for third-party content.

Hiding in a bedroom closet, Nylah attempted the so-called Blackout Challenge. Her mother found Nylah unconscious, hanging from a purse strap, and unsuccessfully attempted to revive her with CPR. According to the court record, three deep ligature marks on Nylah’s neck confirmed that she had suffered while struggling to free herself. After several days in intensive care, Nylah died.

Section 230 provides that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” It further prohibits the bringing of any cause of action, and the imposition of any liability, under any state or local law that is inconsistent with this immunity.

Nylah’s mother filed products liability, negligence, wrongful death, and survival claims against TikTok. Anderson urged the court to hold TikTok liable as a designer, manufacturer, and seller of a defective product, not for its conduct as a publisher. She also presented evidence that TikTok knew its algorithm promoted the challenge to children and alleged that four other children had died attempting it.

But the court determined that these various “creative” claims could not overcome the reality that the challenge posted on TikTok’s site was created by others, so TikTok cannot be held liable as a publisher. Judge Paul S. Diamond wrote that Section 230 precludes Anderson’s products liability and negligence claims—on which her wrongful death and survival claims depend.

“What matters is not the name of the cause of action—defamation versus negligence versus intentional infliction of emotional distress—what matters is whether the cause of action inherently requires the court to treat the defendant as the ‘publisher or speaker’ of content provided by another.”

In precluding interactive service providers from being treated as the publishers of their third-party content, Congress immunized the providers’ “decisions relating to the monitoring, screening, and deletion of content from [their] network[s]—actions quintessentially related to a publisher’s role,” according to the court.

Other Challenges

TikTok, which is owned by China’s ByteDance Ltd., is facing other challenges and criticisms.

A group of states is probing whether the social media platform is being improperly marketed to children. Police have also raised concerns that videos on the site teach people how to hot-wire and steal cars.

Another suit blames the site for the death of a 14-year-old African-American girl. That complaint claims that TikTok’s algorithm steers more violent videos to minority viewers than to White users.

Texas is probing TikTok for potential human trafficking and child privacy violations.

In past years, there have been other court challenges to the broad immunity social media platforms enjoy, including suits attempting to hold internet sites responsible for a terrorist attack and a mass shooting, but courts have upheld the immunity.

Politicians on both sides of the aisle in Washington have also discussed revising the immunity law.

Earlier this month, the U.S. Supreme Court agreed to hear a case over whether social media companies can be sued over targeted content recommendations. The complaint maintains Google bears some of the responsibility for a 2015 ISIS terrorist attack.

Judge Diamond appeared to acknowledge that some have questioned Section 230. In closing, he wrote:

“Nylah Anderson’s death was caused by her attempt to take up the ‘Blackout Challenge.’ Defendants did not create the Challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way to bring the Challenge to the attention of those likely to be most interested in it. In thus promoting the work of others, Defendants published that work—exactly the activity Section 230 shields from liability. The wisdom of conferring such immunity is something properly taken up with Congress, not the courts.”
