California Governor Gavin Newsom signed an executive order on Monday requiring artificial intelligence companies to prove they have safety and privacy protections in place before winning state contracts, the governor's office announced. The order directs the Government Operations Agency to build new vetting processes within four months, covering bias prevention, child safety content policies, and civil rights protections. The move picks a fight with the White House: Trump's December executive order created a Justice Department task force to sue states over AI regulation, calling state rules an obstacle to American competitiveness.

"While others in Washington are designing policy and creating contracts in the shadow of misuse, we're focused on doing this the right way," Newsom said in a statement.

What companies must prove

AI companies that want California business will need to explain how their technology prevents the exploitation or distribution of illegal content, including child sexual abuse material. They must demonstrate that their models avoid harmful bias. And they need to detail policies preventing "unlawful discrimination, detention, and surveillance," according to the executive order's text.

These are disclosure mandates, not guidelines. Companies that cannot show compliance will lose access to the largest state purchasing operation in the country. California's automated-decision-system regulations, which apply anti-discrimination law to AI tools used in employment decisions, have been in effect since October 2025. The executive order extends that regulatory logic to all state procurement.

State technology officials will also develop best practices for watermarking AI-generated images and manipulated video produced by government agencies. It would be a first-in-the-nation standard for government use of AI watermarking.

California breaks from federal procurement

One provision stands above the rest. California will now conduct independent supply-chain risk assessments, ignoring federal determinations when it sees fit. If Washington labels an AI company a security threat, Sacramento can investigate on its own and keep contracting with that company anyway.

The clause carries real weight right now. The Pentagon terminated its contract with Anthropic earlier this month after the AI company refused to let its models be used for autonomous lethal warfare and mass domestic surveillance. Anthropic sued the federal government, alleging retaliation: Trump ordered all agencies to drop the company's technology, despite the Defense Department having already accepted Anthropic's safety conditions, according to the company's lawsuit.

Anthropic goes unmentioned in the order. But read the procurement clause and the intent becomes obvious. Any AI company cut from federal contracts over policy objections, not actual security threats, can still sell to California.

A state-level revolt

Sacramento isn't acting alone in this. States across the country have passed more than 100 AI-related laws, from chatbot protections for children to copyright safeguards for publishers, according to the New York Times. Every month, another state pushes back.

Trump has tried to slow them down. His December executive order declared that "excessive state regulation thwarts" American AI leadership and directed the Justice Department to challenge state laws it considers obstructive. The White House sent a warning letter to Utah legislators in February that derailed an AI transparency and child safety bill. In Florida, Governor Ron DeSantis, a Republican, failed to pass his own AI legislation after the administration intervened.

But California kept stacking rules. State Senator Scott Wiener's SB 53 puts its own pressure on large AI developers: they must build safety frameworks, publish transparency reports, and tell the state when something breaks. Wiener tried once before, in 2024; Newsom vetoed that bill, saying it wasn't flexible enough. The reworked version took effect January 1. Monday's order takes a different angle. Not legislation this time. Purchasing power.

Why this fight is about money

The bet is that California's sheer economic weight can force what courts and legislation have not. Fourth-largest economy on the planet. Home to 33 of the top 50 privately held AI companies. Between Q3 2024 and Q2 2025, the Bay Area pulled in 51% of all U.S. AI startup funding tracked by Carta; New York drew 11%, Boston 5.5%. For AI talent, California claimed 15.7% of all U.S. job postings in 2024, more than Texas and New York put together, according to the Stanford AI Index.

Walking away from California contracts is not a realistic option for most AI vendors. The math doesn't work.

There's a public engagement piece tucked in here too. "Engaged California," the digital platform the state built for the 2025 LA wildfires, goes statewide. The focus this time is how AI reshapes work. Build language models for a living? Stock a warehouse? Sacramento wants your take. Expect the platform to launch in the coming months.

Four months. That's the deadline for the Government Operations Agency to turn this executive order into binding procurement rules. Companies in San Francisco and Mountain View are already parsing the text. So are their lawyers. The question sitting with Washington is whether a litigation task force assembled last December can outrun a state that controls a quarter of America's AI economy.

Frequently Asked Questions

What does Newsom's executive order require from AI companies?

AI companies seeking California state contracts must demonstrate policies preventing illegal content distribution, harmful bias, and unlawful discrimination. The Government Operations Agency has four months to build formal vetting processes.

How does the order relate to the Anthropic-Pentagon dispute?

The order allows California to conduct independent supply-chain risk assessments separate from federal determinations. If the federal government freezes out a company like Anthropic over policy disagreements, California can investigate independently and keep working with that vendor.

What is Trump's position on state AI regulation?

Trump signed an executive order in December 2025 declaring that excessive state regulation threatens American AI leadership. He directed the Justice Department to create a task force to challenge state AI laws through litigation.

What is SB 53 and how does it connect to this order?

SB 53, authored by state Senator Scott Wiener, requires large AI developers to create safety frameworks, publish transparency reports, and report critical incidents. It took effect January 1, 2026. The new executive order adds procurement requirements on top of that law.

Why is California's economic position relevant to AI regulation?

California is the world's fourth-largest economy and hosts 33 of the top 50 privately held AI companies. The Bay Area captured 51% of U.S. AI startup funding between Q3 2024 and Q2 2025. Most AI companies cannot afford to walk away from California contracts.

Marcus Schuler

San Francisco