Key takeaways
- A closely watched Los Angeles trial is examining whether social media platforms can be held responsible for alleged harms to users who were minors at the time.
- The case focuses on platform design features rather than content, with plaintiffs arguing that engagement mechanics contributed to compulsive use.
- The outcome could influence how courts, regulators and tech companies approach youth safety and product design.
- Meta and YouTube deny the allegations, pointing to existing safeguards and disputing claims of intentional harm.
Opening statements this week in a Los Angeles County courtroom marked the beginning of a trial that many legal observers view as a potential turning point for social media and families. Meta, the parent company of Instagram, and Google’s YouTube now face a jury for the first time over allegations that their platforms were deliberately designed in ways that addict and harm children.
At the center of the case is a 20-year-old plaintiff, identified only as “Kayley G.M.,” whose claims stem from harms alleged to have occurred while she was still a minor. Her experience is serving as a bellwether, a test case meant to gauge how similar lawsuits might unfold. Thousands of related claims are waiting in the wings, giving the outcome significance well beyond a single dispute.
Attorneys for the plaintiffs argue that the platforms rely on design features engineered to exploit adolescent psychology. In opening arguments, lawyer Mark Lanier framed the case as one about “addicting the brains of children,” pointing to mechanisms such as “like” buttons and algorithmic feedback loops that tap into a teenager’s need for social validation. The lawsuit compares these tactics to techniques usually associated with gambling products and tobacco marketing.
“They didn’t just build apps, they built traps,” Lanier said in opening statements this week. “They didn’t want users, they wanted addicts.”
For parents, the implications are hard to ignore. Unlike earlier legal battles over online content, this case focuses not on what children see, but on how platforms function. The core claim is that beyond social content itself, engagement-driven design contributed to compulsive use and worsening mental health outcomes.
That distinction matters legally. Tech companies have long leaned on First Amendment protections and Section 230, a law shielding platforms from liability for user-generated content. By arguing that the harm stems from product design decisions rather than speech, the plaintiffs aim to place their claims outside both defenses.
Meta and Google dispute the allegations. Both companies maintain that they have invested heavily in youth safety tools and deny that their products are intentionally harmful. Company representatives describe the claims as inaccurate and say the evidence will demonstrate longstanding efforts to support younger users.
The trial is expected to last several weeks and is only the first of several such cases scheduled this year. Additional lawsuits involving social media and children, including actions brought by states and school districts, are moving through courts across the country.
Beyond the United States, governments are increasingly stepping in. Several countries have advanced or enacted age-based restrictions on youth social media use, reflecting a broader global debate about technology, mental health and child safety.