Apple Called Google's AI Nonsense. Then Apple Bought It.

Apple signed a multiyear deal with Google to power Siri using Gemini AI after failed talks with Anthropic and OpenAI. Two major updates coming in 2026.


Last June, Mike Rockwell stood in front of Apple's foundation models team and dismissed a Bloomberg report as "bulls--t." The story claimed Apple was shopping for outside AI. Rockwell runs Siri. He had billion-dollar infrastructure to defend, hundreds of engineering jobs on the line. The denial landed hard.

Seven months passed. Then Apple signed a multiyear deal with Google. Gemini would power Siri. The company that built everything in-house, chips to software to services, had bought someone else's engine. Slapped its own badge on it. That almost never happens.

Siri earned this. A decade of misfires. Misunderstood commands. Answers to questions nobody asked. Features promised at keynotes, then quietly delayed, then forgotten. Apple outsourced the fix because the alternative was more embarrassment. That says something about where the company stands in AI, and what pride costs when you're behind.

The foundation models team knew the truth before leadership would admit it. Engineers began leaving within weeks of that June meeting, including Ruoming Pang, who ran the group. They could read the org chart as well as anyone. When your employer starts shopping for replacements, the denials ring hollow.

The Breakdown

• Apple signed a multiyear Gemini deal after failed talks with Anthropic and OpenAI

• Two Siri updates coming: iOS 26.4 in March and iOS 27 at WWDC in June

• Google collects roughly $1 billion annually from the deal

• Apple AI talent drain continues as engineers leave for competitors


The negotiations nobody expected

Apple's path to Google ran through two failed deals.

Anthropic wanted several billion dollars annually over multiple years. The terms didn't work for Apple, which tends to squeeze suppliers rather than write blank checks. OpenAI posed different problems. The company had been poaching Apple engineers. Worse, Sam Altman was building hardware with Jony Ive, Apple's former design chief. Handing AI to a company competing for your talent and your hardware market? That math didn't work.

Here's the twist: Google wasn't even on the shortlist. Alphabet was tangled in an antitrust lawsuit partly involving its $20 billion annual search deal with Apple. Legal teams on both sides had spent months gaming out scenarios where the partnership might unwind. Then a September court ruling cleared the obstacle. The mood in Cupertino shifted within days.

By the time Apple circled back, Gemini had improved. The technology that seemed second-tier in spring looked competitive by fall. Google also accepted financial terms that Apple found reasonable, though neither company has disclosed specifics. One report suggests Apple will pay around $1 billion annually, a fraction of the search deal but still substantial.

The agreement was finalized in November. Google will supply Gemini models for Siri and future Apple Intelligence features, initially running on Apple's Private Cloud Compute servers. Later versions will run directly on Google's infrastructure using its tensor processing units.


The rebadging

Apple has masked the partnership with familiar branding maneuvers. Internally, the Gemini-powered system carries the name Apple Foundation Models version 10. Nothing in the marketing materials will mention Google. Roughly 1.2 trillion parameters. Hosted on Apple servers. Built in Mountain View. That's the arrangement.

Car companies do this all the time. Toyota engine in a Lexus body. Volkswagen platform under an Audi shell. The badge matters more than the engineering, at least to buyers. Apple is betting most people won't care. Some won't even notice.

For most users, that bet will pay off. You don't care who built the model; you care whether Siri can finally set alarms correctly. If Google's technology fixes those problems, the provenance becomes a footnote. The assistant will tap into personal data and on-screen content to complete tasks, features Apple promised at WWDC 2024 and failed to ship for twenty months.

The February demonstration will show what the rebadged engine can do. iOS 26.4 ships in March, maybe April.

Federighi's gamble

Craig Federighi runs Apple's AI now. His organization feels tense. Everyone knows the old strategy failed. Nobody's confident the new one works better. Federighi grabbed Siri. Grabbed the broader AI group. Tim Cook had soured on John Giannandrea, and Federighi saw an opening. Six years Giannandrea led machine learning. Then he didn't. Giannandrea was publicly pushed out in December, allowed to collect salary and equity through an April vesting date but effectively finished. Engineers who worked under him describe the transition as abrupt, the messaging inconsistent.

Federighi concluded that Apple couldn't catch up using internal models alone. The company's World Knowledge Answers project, intended to compete with ChatGPT and Perplexity, has been scaled back. Plans for an AI-powered Safari browser that could assess document trustworthiness are paused. Health-related AI features went back to the drawing board. The ambitious rollout of standalone chatbots embedded in apps like Music, Podcasts, and TV has shifted toward a unified Siri integration.

The bet is that users won't notice the difference between Apple-built and Google-built AI, but they will notice if Siri finally works. Federighi is trading intellectual property control for shipping deadlines. The foundation models team still exists, still develops on-device models, but the cloud infrastructure powering Siri's future carries a competitor's engineering under Apple's hood.

Two Siris, one year

Apple will ship two major Siri updates in 2026, and the difference between them reveals how deep the rebadging runs.

The February version, arriving in iOS 26.4, handles personal context, on-screen awareness, task completion. These features match what competitors delivered years ago. Apple Foundation Models version 10 powers this release, running on Apple servers with Google technology underneath. Ask Siri to find that restaurant your friend mentioned last week. The query hits Apple's Private Cloud Compute. Processes through Gemini-derived models. Comes back branded as Apple Intelligence. You'd never know Google touched it.

Then there's the bigger update. iOS 27, WWDC, June. Code-named Campos, this Siri features an entirely new architecture built for conversational interaction. It will handle sustained back-and-forth dialogue, maintain context across exchanges, and compete directly with ChatGPT and Gemini as chatbot products. Apple Foundation Models version 11 powers this version, expected to match Gemini 3 capabilities.

Here's where the rebadging gets complicated: Apple and Google are discussing running this version directly on Google's cloud infrastructure rather than Apple's servers. If that happens, your Siri queries would process on Google's tensor processing units. Apple has spent years arguing that on-device processing and Private Cloud Compute protect user data from exploitation. Moving queries to Google's infrastructure muddles that narrative. Neither company has clarified how data will flow or what protections apply.


The engineers who stayed

Walk through Apple's AI research buildings and you'll find a deflated atmosphere. The strategic direction has become obvious to everyone working there: cloud models matter more than on-device models, third-party technology beats internal development, and the foundation models team works on problems that won't ship for years. Engineers describe feeling like mechanics at a dealership that just switched to selling someone else's cars.

A potential acquisition of an external model developer, which might have changed the calculus, collapsed late in negotiations. With that deal dead and Federighi committed to Google, the near-term path involves implementation rather than invention. Apple will build interfaces and integrations while Google supplies the intelligence.

Some engineers stay because the on-device work remains genuinely interesting. Models running directly on iPhones and Macs will continue using Apple-developed technology, constrained by power and memory limits that don't apply in the cloud. But the prestige projects, the ones that might define the next decade of computing, now carry Google's name on the underlying research. The engineers who wanted to build engines find themselves installing someone else's.

Who wins either way

Alphabet's market capitalization passed Apple's for the first time since 2019 shortly after the partnership announcement. You could feel the shift in mood across both companies: Apple needed Google's help to compete in the technology that drove Google's stock higher.

Google collects roughly $1 billion annually from the deal regardless of whether Siri improves. If the new assistant fails, Apple absorbs the reputational damage while Google keeps the check. If Siri succeeds, Google can claim partial credit while Apple does the marketing. The financial structure favors the supplier over the brand.

For users, none of this matters much. The Siri blob will still appear when you hold the side button. The voice will still respond to queries. The experience might finally match the advertisements. Who deserves credit for that improvement? Depends what you mean by credit. Google wrote the code. Apple shipped the product. Pick your definition.

Apple chose shipping. The foundation models team learned that lesson last June, even when leadership called the reporting nonsense.

Frequently Asked Questions

Q: How much is Apple paying Google for Gemini?

A: Reports suggest Apple pays around $1 billion annually for the Gemini deal, a fraction of the roughly $20 billion Google pays Apple each year for Safari search placement.

Q: When will the new Siri be available?

A: Apple demonstrates the new Siri in February 2026. Public release in iOS 26.4 arrives in March or April. A more advanced version debuts with iOS 27 at WWDC in June 2026.

Q: What happened to John Giannandrea?

A: Giannandrea led Apple machine learning for six years before being pushed out in December 2025. Craig Federighi now controls Apple's AI direction.

Q: Does this affect Apple's privacy claims?

A: Initially Gemini runs on Apple's Private Cloud Compute servers. But Apple and Google are discussing running iOS 27 Siri directly on Google infrastructure, which muddles Apple's privacy narrative.

Q: What Apple AI projects got scaled back?

A: Apple scaled back World Knowledge Answers (its ChatGPT competitor), paused an AI Safari browser, sent Health AI back to the drawing board, and shifted from standalone app chatbots to unified Siri.

