Meta and YouTube Held Liable for Social Media Addiction: A Landmark Verdict Explained

A jury found Meta and YouTube negligent in the design and operation of their social media platforms — making this the first lawsuit to take the tech giants to trial over social media addiction. Meta and Google must now pay damages to a 20-year-old woman who said her addiction to social media caused her mental health struggles. The verdict is already reshaping how courts, parents, and regulators think about platform accountability.


What the Trial Was About

A California jury found Meta and YouTube liable on all counts in a lawsuit accusing the tech giants of intentionally addicting a young woman and injuring her mental health.

The plaintiff, referred to as “Kaley,” began using Instagram as a child. Kaley claims the app’s addictive features led her to develop anxiety, body dysmorphia, and suicidal thoughts, and that she experienced bullying and sextortion on Instagram. Her lawyers argued that this was not a personal failing — it was a design choice.

Kaley sometimes used Instagram for several hours a day and was once on the platform for more than 16 hours in a single day, despite her mother’s attempts to curb her use.


How Meta and YouTube Designed for Dependency

The trial exposed specific product decisions that plaintiffs argued fueled social media addiction. The companies bombard users with notifications, offer interfaces built on infinite scrolling, and run algorithms that promote misleading and harmful content. Each of these features, the plaintiffs contended, is engineered to keep users on the platform longer.

Until recently, US courts had largely denied motions to dismiss claims focused on design features such as infinite scroll and notification systems, but none had produced a verdict. The distinction between "platform design" and "content curation" has been central to how courts have analyzed First Amendment arguments in this litigation. This verdict is the first to cross that line.

Meta disputed the core claims. Mark Zuckerberg testified that while Meta previously had goals related to the amount of time users spent on the app, it has since changed its approach. The jury was not persuaded.


The Scale of Social Media Addiction

This case isn’t isolated. The broader statistics on social media addiction are difficult to ignore.

Globally, the average person spends two hours and 27 minutes on social media every day. In a 2023 Pew Research survey, 46 percent of teens said their internet use was almost constant.

The mental health consequences are measurable:

  • Nearly 40 percent of high school students experienced persistent feelings of sadness and hopelessness, and more than 20 percent seriously considered suicide within the past year, according to 2024 CDC data.
  • Roughly 42 percent of teens admit that social media keeps them from connecting with friends in person.
  • Instagram was rated the most detrimental to mental health among popular apps, with teenagers between the ages of 14 and 17 reporting increased anxiety, depression, and loneliness.

Research also suggests that social media addiction can be more behaviorally reinforcing than alcohol or cigarettes: not because of a chemical substance, but because of feedback loops built into the product.


What Meta Has Done in Response

In September 2024, Meta introduced Instagram Teen Accounts, which automatically apply stricter safety settings to users between the ages of 13 and 17. In April 2025, Meta began blocking teens under 16 from going live on Instagram. In September 2025, the company launched a school partnership program that gives educators expedited review of complaints such as cyberbullying.

Critics argue these changes are incremental. The platforms have shared little of the data that would allow academic researchers to study these questions rigorously. Without that transparency, independent assessment remains limited.


What This Verdict Could Change

The verdict could shape hundreds of similar cases. TikTok and Snap, previously named in related litigation, have already reached settlements. If Meta and YouTube face significant damages in follow-on trials, the financial pressure to redesign their platforms increases substantially.

For parents and users, the legal precedent is clear: platforms bear some responsibility for how their products affect users — particularly minors. If you or someone close to you is struggling with compulsive social media use, that experience now has formal legal recognition.


Recognizing Media Addiction

Not every heavy user has a clinical dependency. But the following patterns warrant attention:

  • Using the app despite wanting to stop: loss of behavioral control
  • Mood changes when access is restricted: psychological dependence
  • Sleep disruption due to late-night scrolling: prioritization disorder
  • Declining real-world relationships: social displacement
  • Anxiety when the platform is down: dependency on external validation

If several of these apply, speaking with a mental health professional is a reasonable next step — not a dramatic one.


The Meta and YouTube verdict does not resolve the broader debate about social media regulation. But it establishes, for the first time in a jury trial, that a platform’s design choices can legally constitute harm. That changes the conversation.
