The jury in Santa Fe needed less than a day.

After nearly seven weeks of testimony from 40 witnesses and hundreds of internal documents, a jury of New Mexico residents concluded on Tuesday that Meta had willfully endangered children and concealed what it knew about sexual exploitation on its platforms. The jury also found that Meta engaged in what state law calls "unconscionable" trade practices. The penalty: $375 million, the maximum $5,000 per violation across two counts involving 37,500 affected teens. That figure represents roughly one-quarter of New Mexico's teen population, according to the most recent census data.

Meta's stock rose 5% in after-hours trading.

That gap between verdict and market reaction tells you where this fight actually lives. Not in the fine. Against Meta's $201 billion in 2025 revenue, $375 million works out to 0.19%. Investors ran the numbers and decided the figure was, in practical terms, nothing.

The arithmetic checks out. What they're missing is what New Mexico just built.

The Breakdown

The playbook that cracked Section 230

Section 230 of the Communications Decency Act gave social media companies a legal shield that held for three decades. Sue them for what their users posted and your case died on procedural motions. Courts barely blinked.

New Mexico Attorney General Raúl Torrez refused to fight on those terms. His office's 2023 lawsuit didn't argue that Meta was liable for content. It argued the platforms themselves were defective products. Defective in their recommendation algorithms, which surface harmful material to minors. Defective in design choices built to maximize engagement. And defective in the 2023 decision to encrypt Facebook Messenger, which prosecutors argued cost dearly. How dearly? The National Center for Missing and Exploited Children tracks these reports from platforms. After Messenger went encrypted, prosecutors argued, roughly two-thirds of Meta's reports to the center stopped.

Meta's response? Dismiss the whole thing. Its lawyers cited Section 230 and the First Amendment in May 2024. The judge rejected both, ruling the case went after platform design, not user speech.

That procedural ruling was the inflection point. Tuesday's verdict confirmed it holds up in front of a jury.

Torrez's team built their evidence on an undercover sting dubbed "Operation MetaPhile." Investigators created decoy accounts on Facebook and Instagram posing as users younger than 14. Adult users flooded those accounts with sexually explicit messages. Solicitations piled up fast. By May 2024, investigators had arrested several men from the sting. Two of them had driven to a motel, certain a 12-year-old was waiting.

The engineers inside Meta already knew. Former engineering leader Arturo Béjar put it bluntly in court testimony. "The product is very good at connecting people with interests," he said. "And if your interest is little girls, it will be really good at connecting you with little girls."

More than 40 state attorneys general have filed lawsuits against Meta over child safety, pursuing a range of legal theories including products liability and consumer protection. New Mexico just proved the products-liability approach works in front of a jury.

The evidence Meta can't take back

The trial's most damaging testimony came not from prosecutors but from Meta's own people.

Brian Boland spent nearly 12 years as a Meta vice president. His courtroom testimony left little room for interpretation. By the time he walked out in 2020, he told the jury, he "absolutely did not believe that safety was a priority" for Zuckerberg or then-COO Sheryl Sandberg. Béjar had his own reasons to testify. His daughter, 14 at the time, received unwanted sexual advances on Instagram. He spent years raising alarms with executives. They chose not to act.

Internal research presented at trial showed that 19% of Instagram users aged 13 to 15 had reported being shown unwanted nudity or sexual activity in a single week. Witnesses from the National Center for Missing and Exploited Children testified that Meta's AI-driven content moderation generated floods of "junk" reports, useless to investigators. Actionable intelligence buried in noise.

Zuckerberg's recorded deposition made it worse. Played on a screen in the Santa Fe courtroom, his testimony described the company's research on whether its platforms are addictive as "inconclusive." Prosecutors pointed to Meta's own internal studies showing multiple product features designed to trigger dopamine responses and increase time on the app. When asked whether he, as a parent, would want to know if a product his child used was addictive, Zuckerberg said there was "a lot to unpack in that."

The jurors found enough to unpack on their own. Less than a day of deliberation. Liable on every count.

Here is what should make Meta's legal team anxious. Much of the trial testimony, including depositions, internal emails, and internal research findings, entered the court record through open proceedings. Plaintiffs' attorneys in those 40-plus AG cases, and in the pending federal multidistrict litigation in Northern California, now have a detailed map of what to subpoena and what it will reveal. The trial didn't just produce a verdict. It produced a roadmap for future litigation.

The next phase is what actually matters

If you're tracking this story for its business implications, watch for additional proceedings scheduled for May.

That's when the second phase of New Mexico's case is expected to begin. A bench trial before Judge Bryan Biedscheid, no jury. The question shifts from liability, already established, to remedy. The state will argue that Meta created a public nuisance and should be ordered to implement specific product changes: effective age verification, removal of known predators, and modifications to encrypted messaging that currently shields exploitation from investigators.

Financial penalties are absorbable. Meta can write checks indefinitely. A court ordering you to redesign your product is a different animal.

Here is what that means in practice. Say a New Mexico judge orders age verification on Instagram. Other states will file for the same under their own consumer protection laws. If encrypted messaging modifications are required in one jurisdiction, how does Meta maintain a different product architecture for the remaining 49? The compliance geometry becomes unmanageable fast.

This is the structural risk the stock price isn't reflecting. Not one verdict in one state. A patchwork of court orders, each potentially demanding different product changes, emerging from a legal playbook that has now been jury-tested and validated.

The pressure is converging from multiple directions. In Los Angeles, a separate jury has deliberated for over a week on whether Meta and YouTube are liable for addictive design. Snap and TikTok settled before that trial began. AI companion platforms have faced their own legal reckoning over harm to teenagers. In Brussels, regulators have gone after TikTok on similar grounds, charging the company with violating digital safety law over addictive design. Every jurisdiction is moving at once.

The fine was never the point

Meta called it a disagreement. An appeal is coming, a company spokesperson said. Measured language. Almost formulaic. But the polished language masks the real damage. Internal documents and whistleblower testimony are now public. A jury validated the legal theory. That combination hands every plaintiff's attorney in the country what they've been waiting for. The next phase of the trial won't ask how much Meta should pay. It will ask what Meta should change.

Emboldened state attorneys general now have a tested playbook and a body of reusable evidence. Meta's defense that it discloses risks and invests in safety, the same defense it ran in Santa Fe, will face the same internal documents and the same whistleblower testimony in courtroom after courtroom. The company can appeal one verdict. It cannot appeal the pattern.

Return to that after-hours stock movement. Up 5%. Investors looked at $375 million against a $1.5 trillion market cap and calculated a rounding error. Fair enough.

But the jurors in Santa Fe didn't produce a fine. They produced a proof of concept. The playbook works. Section 230 can be sidestepped. Juries will find liability. And when the bench trial begins, a judge will decide whether the consequence for Meta is a check or a redesign.

The money New Mexico won is noise. The playbook it gave to 40 other states is signal.

Frequently Asked Questions

Why did Meta's stock rise after a $375 million verdict?

The fine amounts to 0.19% of Meta's $201 billion in 2025 revenue. Investors treated it as financially immaterial. The structural risk from court-ordered product changes and 40-plus pending AG lawsuits is harder to price but far more consequential.

How did New Mexico get around Section 230?

Attorney General Raúl Torrez argued the platforms themselves were defective products, targeting design choices and recommendation algorithms rather than user-posted content. A judge ruled Section 230 doesn't shield platform architecture decisions, and the jury validated that theory.

What is the May bench trial about?

The second phase shifts from liability (already established) to remedy. Judge Bryan Biedscheid will decide whether Meta must implement specific product changes like age verification, predator removal tools, and modifications to encrypted messaging.

What was Operation MetaPhile?

An undercover sting where New Mexico investigators created decoy accounts posing as users younger than 14. Adult users quickly flooded the accounts with sexual messages, and arrests followed by May 2024, including men who drove to a motel expecting to meet a child.

Can other states use this verdict against Meta?

More than 40 state attorneys general have filed lawsuits over child safety. The New Mexico trial produced public testimony, internal documents, and a validated legal theory that other states can reference and build on in their own cases.
