A Los Angeles jury on Wednesday delivered a verdict in the first trial to test whether social media platforms can be held legally responsible for addicting children. The twelve-member panel awarded plaintiff K.G.M., a 20-year-old California woman, $3 million in compensatory damages after finding that design features including infinite scroll and algorithmic recommendations were a "substantial factor" in causing her depression, anxiety, and body dysmorphia. Meta owes 70 percent of the award, YouTube the rest. Jurors went further, ruling that both companies had acted with "malice, oppression, or fraud." That finding means a separate hearing on punitive damages is coming.
Social media's legal shields are cracking on multiple fronts. A day before the Los Angeles verdict, a New Mexico jury ordered Meta to pay $375 million for letting predators reach children on Facebook and Instagram. Federal trials involving school districts are scheduled for this summer. More than two thousand individual child safety cases sit in courts nationwide. Three million dollars is pocket change for a company that pulled in two hundred billion last year. The precedent it sets is not.
Key Takeaways
- Los Angeles jury found Meta and YouTube negligent, awarding $3 million in the first social media addiction trial
- Jury also found "malice, oppression, or fraud," opening the door to punitive damages
- Internal 2015 slides showed Instagram had 4 million+ users under 13 with no age verification until 2019
- More than 100,000 individual arbitration claimants have filed addiction-related demands against Meta
How the plaintiff's lawyers won
K.G.M.'s legal team, led by attorney Mark Lanier, borrowed a theory from tobacco litigation and sharpened it for a courtroom in Los Angeles County. The companies did not simply host harmful content, the lawyers argued. They built the addiction into the product itself.
Section 230 of the Communications Decency Act has shielded platforms from liability for decades by protecting them from responsibility for user-posted content. Every major social media company facing child safety allegations has invoked it. Lanier's team sought to neutralize the defense by shifting the target from what appears on the platforms to how they are built. Infinite feeds that never run out. Autoplay that launches the next video before you've processed the last one. Push notifications engineered to pull users back onto the app. Lanier called these features a "Trojan horse," designed to manufacture "engineered addiction" in developing brains.
K.G.M. was six years old when she started watching YouTube. Instagram came at nine. Her attorney told the jury she had uploaded more than three hundred videos before turning ten. "Every single day, I was on it all day long," she told the court. Likes and notifications gave her an emotional "rush." Logging off sent her into panic. "I just felt like I wanted to be on it all the time, and if I wasn't on it, I felt like I was going to miss out on something." The Instagram filters were a separate problem. They warped how K.G.M. saw her own face and body. She dropped hobbies. Making friends at school got harder. Eventually she started cutting herself.
Victoria Burke, a therapist who treated K.G.M. in 2019, put it plainly. Social media and K.G.M.'s sense of self "were closely related," Burke told the court. What happened on the apps could "make or break her mood."
The legal team also surfaced damaging internal evidence. A set of 2015 Instagram slides showed the platform carried more than four million U.S. users under age 13, Gizmodo reported. Instagram didn't start asking users for their age until 2019. Four years of documented underage access to a product the company's own policies prohibited children from using.
"This is the first time in history a jury has heard testimony by executives and seen internal documents that we believe prove these companies chose profits over children," said Joseph VanZandt, one of K.G.M.'s attorneys. Lead counsel Mark Lanier told reporters he hoped the trial would produce transparency "so that the public can see that these companies have been orchestrating an addiction crisis in our country and, actually, the world."
What the executives said under oath
The six-week trial in Los Angeles County Superior Court forced some of the industry's most senior executives onto the witness stand. Their responses, more than once, complicated their own companies' positions.
Meta CEO Mark Zuckerberg acknowledged that some minors circumvent the platform's age rules but suggested it is "up to users to read the terms." When the plaintiff's lawyer asked whether addictive products tend to drive heavier usage, Zuckerberg said he wasn't sure how to respond. "I don't think that applies here," he told the court. Pressed on the point, he offered what he described as a "basic assumption" that "if something is valuable, people will use it more because it's useful to them." Scientific literature, he said, hasn't settled the question. But the jury also heard something else about Zuckerberg. He had at some point called Apple CEO Tim Cook to discuss teen wellbeing. And when it came to cosmetic surgery filters on Instagram, it wasn't just product teams making the call. Senior executives got involved. That gap between internal awareness and public messaging was exactly what K.G.M.'s lawyers wanted the jury to see.
Then there was Adam Mosseri. Sixteen hours a day on Instagram, the plaintiff's lawyer asked. Is that addiction? Mosseri wouldn't use the word. "Problematic use." He said profit comes from protecting kids, not exploiting them, and that he genuinely did not believe clinical addiction to social media was a real thing. The defense had a point in its favor there. No version of the DSM, psychiatry's diagnostic bible, recognizes social media addiction as an official condition. But a jury staring at a young woman who couldn't stop scrolling at age nine was not especially moved by the diagnostic gap.
YouTube's Cristos Goodrow, a vice president of engineering, told the court the platform was "not designed to maximize time." YouTube's lawyers tried a different angle entirely. They compared the service to television, not social media, and pointed to data showing K.G.M. barely used its infinite-scroll Shorts feature.
The defense that fell short
Meta and YouTube told a different story about K.G.M.'s suffering. Verbal abuse at home. Physical abuse. A turbulent childhood that, they argued, explained everything the plaintiff blamed on their products. Meta's Paul Schmidt played a video during closings that appeared to show K.G.M.'s mother yelling at her. His argument to the jury was simple: prove that taking Instagram away would have made K.G.M.'s life "meaningfully different."
"Not one of her therapists identified social media as the cause," a Meta spokesperson said during the trial. That was the company's position from start to finish.
The jury spent nine days going back and forth, more than forty hours locked in deliberation. At one point they told Judge Carolyn B. Kuhl they were stuck on one of the two defendants. Kuhl's response was brief: read the instructions aloud, then get back to work. The verdict came back split. Seven women and five men on the panel, ten for the plaintiff, two against. They found both companies knew their products were "dangerous" and had failed to warn anyone. Three million in compensatory damages, which wouldn't cover a week of Meta's advertising revenue.
But the dollar amount is almost beside the point. Punitive damages come next. And the legal theory behind the case, that platform design can constitute a defective product, has now survived a jury trial. "This is a breakthrough because it validates a new theory that platform design can be a defective product," said Kimberly Pallen, a litigation partner at law firm Withers who does not represent clients in these cases.
"We respectfully disagree with the verdict and are evaluating our legal options," a Meta spokesperson said. Google did not immediately respond to requests for comment.
Two courtroom losses in 24 hours
The Los Angeles verdict arrived less than a day after a New Mexico jury ordered Meta to pay $375 million in civil penalties. Attorney General Raul Torrez brought that case after prosecutors posed as children on Facebook and Instagram and documented the sexual solicitations they received. The jury found thousands of individual violations of New Mexico's Unfair Practices Act.
The harms were different, and so were the legal theories. But jurors in both courtrooms reached the same conclusion. Meta knew its platforms endangered children and failed to act. Meta said it disagrees with the New Mexico verdict and plans to appeal.
Outside the Los Angeles courthouse, families who say social media damaged their children embraced after the reading and told reporters they felt "vindicated." K.G.M.'s co-lead counsel issued a statement calling the decision "a referendum, from a jury to an entire industry, that accountability has arrived."
The litigation pipeline
The Big Tobacco comparison has currency because the underlying mechanics match. Philip Morris and R.J. Reynolds settled with more than 40 states for $206 billion in 1998. Marketing restrictions followed. Smoking rates dropped. Now attorneys who cut their teeth on the opioid litigation against pharmaceutical distributors are working the plaintiff's side in social media cases. Jayne Conroy, a lawyer on the plaintiffs' trial team in the upcoming federal case, was blunt about the parallel. "These companies knew about the risks, they have disregarded the risks, they doubled down to get profits from advertisers over the safety of kids," she told the Associated Press. "And kids were harmed and kids died."
Sixteen hundred plaintiffs are already lined up in coordinated California proceedings. Over 350 families. Two hundred and fifty school districts. A federal case lands this summer in front of U.S. District Judge Yvonne Gonzalez Rogers in Oakland, with six school districts going first. And buried in Meta's 2026 annual filing is a number worth pausing on: attorneys for more than one hundred thousand individual claimants have submitted addiction-related arbitration demands since late 2024.
Eight more individual plaintiff trials are scheduled in Los Angeles this year. Appeals from this week's verdicts alone will stretch for years. Full resolution, if the tobacco and opioid timelines are any guide, will consume the better part of a decade. And unlike Europe, where regulators have already charged TikTok with addictive-design violations under the Digital Services Act, U.S. federal regulation of social media remains stalled.
"While Meta has doubled down in this area to address mounting concerns by rolling out safety features, several recent reports suggest that the company continues to aggressively prioritize teens as a user base and doesn't always adhere to its own rules," said eMarketer analyst Minda Smiley.
The punitive damages hearing begins next. Eight more trials follow this year. The question of whether social media design can cause personal injury is no longer theoretical. A jury in Los Angeles answered it on Wednesday.
Frequently Asked Questions
What was the verdict in the social media addiction trial?
A jury found Meta and YouTube negligent and awarded $3 million in compensatory damages to plaintiff K.G.M., a 20-year-old woman who said social media caused her depression and body dysmorphia. The jury also found both companies acted with malice, triggering a punitive damages hearing.
What is Section 230 and why didn't it protect Meta and YouTube?
Section 230 shields platforms from liability for user-posted content. The plaintiff's lawyers sidestepped it by targeting how the platforms were designed, specifically features like infinite scroll and autoplay, rather than what users posted on them.
How does this case compare to Big Tobacco lawsuits?
Both cases argue companies knew their products were addictive and harmful but prioritized profits. Attorneys from the opioid litigation are now working plaintiff-side in social media cases, using similar arguments about corporate knowledge and concealment.
How many similar lawsuits are pending against social media companies?
Over 1,600 plaintiffs are in coordinated California proceedings, and more than 100,000 individual arbitration demands have been filed. Eight more trials are scheduled in Los Angeles this year, plus a federal case this summer in Oakland.
What happens next in the K.G.M. case?
The jury will determine punitive damages, which could significantly increase the payout beyond the $3 million compensatory award. Meta has indicated it plans to challenge the verdict.