A Massachusetts trial court judge has denied TikTok’s bid to dismiss a lawsuit by the Commonwealth of Massachusetts alleging that the company’s popular video-sharing platform is intentionally designed to be addictive and harmful to young users, and that the company misrepresents its safety to the public.
The ruling rejected the company’s argument that it is shielded from such lawsuits by federal law, meaning TikTok must face the claims in state court.
The case mirrors one the state is pursuing against Meta Platforms in which the same judge, Suffolk Superior Court Justice Peter B. Krupp, found for the same reasons that a social media firm is not shielded from such lawsuits.
Both TikTok and Meta have argued that Section 230 of the federal Communications Decency Act, which grants them immunity from liability for content posted by third parties, shields them from these state lawsuits.
Last month, the state’s Supreme Judicial Court (SJC) upheld Krupp’s previous ruling against Meta, agreeing that Section 230, which shields internet firms from liability for content posted on their sites, does not protect Meta from a lawsuit over design features of Instagram that the state alleges harm children’s mental health.
Krupp cited the SJC opinion in denying TikTok’s motion to dismiss.
“I see no meaningful distinction between the cases and the arguments presented and rejected by the SJC in Meta,” Krupp wrote in denying TikTok’s motion to dismiss the lawsuit, “for the reasons set out in Meta and, to the extent not addressed by the SJC, by the rationale that I applied in the Meta case.”
As in its case against Meta regarding its Instagram service, the Commonwealth of Massachusetts alleges that TikTok knowingly designed its video sharing platform with a series of features that, “irrespective of the content delivered, would addict young users, cause them to spend more time on TikTok, and override their ability to disconnect, all for TikTok’s financial benefit and at the expense of the physical and mental health of children.” The challenged features include push notifications; infinite scroll; autoplay; visible social validation metrics; and time-maximizing algorithms.
The state points to internal TikTok documents it says show management was aware of the negative effects of its algorithms—such as sleep disruption and mental health issues—but pushed forward with engaging design for profit.
Massachusetts Attorney General Andrea Joy Campbell filed the suits against Meta and TikTok in October 2024.
In affirming the denial of Meta’s dismissal motion, the SJC noted that the state’s claims do not seek to impose liability for information provided by third parties. Instead, the claims allege harm stemming from the platform company’s own conduct either by designing a social media platform that capitalizes on the developmental vulnerabilities of children or by affirmatively misleading consumers about the safety of the Instagram platform.
The TikTok ruling is another in a string of recent losses for the technology industry. A California jury recently awarded $6 million in damages to a woman who claimed she became addicted to Meta’s Instagram and to YouTube as a child. Also, on March 24, a New Mexico jury hit Meta with $375 million in civil penalties after finding that Meta violated the state’s consumer protection law by misleading users about the safety of Facebook, Instagram and WhatsApp and by enabling child sexual exploitation on those platforms.
Meta, Google, Snapchat and TikTok are facing thousands of lawsuits in state and federal courts tying their designs to a mental health crisis for teens and young people.