💡 TL;DR - The 30-Second Version
📱 Apple considers using Anthropic's Claude or OpenAI's ChatGPT to power Siri instead of its own AI models, Bloomberg reports.
💰 Anthropic wants multibillion-dollar annual fees that increase sharply each year, creating a financial roadblock in negotiations.
🚪 Apple's top AI researcher Tom Gunter left after eight years, while the MLX team nearly quit before counteroffers kept them.
📊 Meta offers AI engineers $10-40 million annually, while Apple typically pays half of market rates or less.
⏰ Apple delayed its AI-powered Siri features from 2025 to 2026, prompting management changes that sidelined AI chief John Giannandrea.
🔄 The move would mark Apple's first major departure from building everything in-house, signaling how far behind it has fallen in AI.
Apple is considering using artificial intelligence from Anthropic or OpenAI to power Siri instead of its own technology, Bloomberg News reported Monday. The move would mark a major reversal for a company that builds everything in-house.
The iPhone maker has talked with both companies about using their language models for Siri. Apple asked them to train versions that could run on Apple's cloud servers for testing. Current plans call for Apple's own models to power a new Siri in 2026.
A switch to outside technology would be an admission that Apple is struggling in AI. The company already lets ChatGPT answer some Siri questions, but the assistant itself runs on Apple's technology.
Testing Shows Apple's AI Falls Short
Siri chief Mike Rockwell and software head Craig Federighi started the project to evaluate outside models. They took over Siri duties after AI chief John Giannandrea was sidelined following delays with Apple Intelligence features.
After multiple rounds of testing, Apple executives concluded that Anthropic's technology works best for Siri's needs. That finding led corporate development VP Adrian Perica to start talks with Anthropic about using Claude.
The Siri assistant has fallen behind popular AI chatbots since its 2011 launch. Apple unveiled new Siri features last year that would tap into personal data and control apps better. The company initially planned an early 2025 release but delayed it indefinitely. Those features are now planned for next spring.
Money Talks Create Roadblocks
Apple and Anthropic disagree over money. The AI startup wants multibillion-dollar annual fees that increase sharply each year. The struggle to reach a deal has Apple considering OpenAI or others if it moves forward with outside technology.
This approach mirrors Samsung's strategy. While Samsung brands features under Galaxy AI, many actually use Google's Gemini. Anthropic already powers Amazon's new Alexa.
Apple executives increasingly think embracing outside technology is key to a quick turnaround. They don't see the need to rely on their own models when they can partner with third parties instead.
Internal Team Faces Morale Crisis
The proposed shift has hurt morale among Apple's AI team. Some members feel blamed for the company's AI problems and say they might leave for multimillion-dollar packages from Meta and OpenAI.
Meta offers some engineers annual pay between $10 million and $40 million to join its new Superintelligence Labs group. Apple typically pays its AI engineers half or less of what they could earn elsewhere.
Tom Gunter, one of Apple's top language model researchers, left last week after eight years. Colleagues see him as hard to replace given his skills and competitors' willingness to pay more for talent.
Apple also nearly lost the team behind MLX, its system for developing machine learning models on Apple chips. After the engineers threatened to leave, Apple made counteroffers to keep them.
Technical Details and Privacy Focus
In talks with both companies, Apple requested custom versions of Claude and ChatGPT that could run on its Private Cloud Compute servers, which are built on high-end Mac chips and currently run Apple's own models.
Apple believes running models on its own chips in Apple-controlled cloud servers will better protect user privacy than relying on third-party infrastructure. The company has already tested this approach internally.
Other Apple Intelligence features use AI models that live on devices. These slower, less powerful versions handle tasks like summarizing emails and creating custom emojis.
Apple plans to open the on-device models to third-party developers later this year. The company hasn't announced plans to give apps access to its cloud models, partly because its servers don't yet have the capacity to support many new features.
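For developers, that access would likely look like a small framework call. The sketch below is a minimal, hypothetical Swift example, assuming Apple exposes the on-device model through the Foundation Models framework it previewed at WWDC 2025; the exact types and method names could change before release.

```swift
import FoundationModels

// Hypothetical sketch: ask the on-device model to summarize an email.
// Based on the Foundation Models framework Apple previewed at WWDC 2025;
// names like LanguageModelSession and respond(to:) may differ in the shipping SDK.
func summarize(email: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Summarize the email in one sentence."
    )
    let response = try await session.respond(to: email)
    return response.content
}
```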
Management Shake-up Continues
If Apple strikes a deal, Giannandrea's influence would continue to shrink. Besides losing Siri, he was stripped of responsibility for Apple's robotics unit. The Core ML and App Intents teams also moved to Federighi's software engineering group.
Apple's AI team had been building language models to help developers write code in Xcode, the company's programming software. Apple killed that project about a month ago. Instead, it will roll out a new version of Xcode that can tap into third-party programming models, letting developers choose ChatGPT or Claude.
Apple has approved a multibillion-dollar budget for 2026 to run its own models through the cloud. But plans beyond that remain unclear as executives weigh different directions.
Some engineers fear that moving to third-party technology for Siri could signal a broader shift away from in-house models for other features. Last year, OpenAI offered to train on-device models for Apple, but the iPhone maker wasn't interested.
Looking Ahead
Apple continues working on projects that will use AI heavily, including a tabletop robot and glasses. The company has also considered buying Perplexity to boost its AI work and briefly talked with Thinking Machines Lab, the startup founded by former OpenAI chief technology officer Mira Murati.
Competition for the AI industry's most sought-after talent is fierce. Apple's foundation models team, led by distinguished engineer Ruoming Pang, includes about 100 people. Many could command much higher salaries elsewhere.
Whether Apple chooses outside help or doubles down on internal development, the company faces pressure to deliver AI features that match competitors. The decision could reshape how Apple approaches artificial intelligence for years to come.
Why this matters:
- Apple's willingness to outsource Siri reveals how far behind it has fallen in the AI race, forcing the company to abandon its vertical integration playbook for the first time in decades.
- The talent exodus and internal turmoil suggest Apple's AI problems run deeper than technology—they reflect a crisis of confidence that money alone can't fix.
❓ Frequently Asked Questions
Q: Does Apple already use outside AI for Siri?
A: Yes, but only partially. Apple lets ChatGPT answer web search queries in Siri, but the assistant itself runs on Apple's own technology. This proposal would replace Siri's core AI brain entirely.
Q: How much does Apple typically pay AI engineers compared to competitors?
A: Apple pays AI engineers about half of market rates or less. Meta offers packages between $10 million and $40 million annually for top talent, while OpenAI also makes multimillion-dollar offers to poach researchers.
Q: What is Apple's Private Cloud Compute and why does it matter?
A: It's Apple's cloud infrastructure built on high-end Mac chips. Apple wants to run outside AI models on these servers instead of third-party clouds like AWS to maintain better control over user privacy.
Q: Who is Mike Rockwell and why did he take over Siri?
A: Rockwell previously launched the Vision Pro headset and became Siri chief in March 2025. He took over after AI head John Giannandrea was sidelined due to delays with Apple Intelligence features.
Q: What is MLX and why did the team almost quit?
A: MLX is Apple's open-source framework for running machine learning on Apple Silicon chips. The entire team threatened to quit over compensation, forcing Apple to make emergency counteroffers to keep them.
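For a sense of what the team built, here is a tiny illustrative snippet using the open-source mlx-swift package (not drawn from the article): MLX arrays live in Apple Silicon's unified memory, and operations are evaluated lazily.

```swift
import MLX

// Illustrative only: MLX records a lazy computation graph over arrays stored
// in Apple Silicon's unified memory, then evaluates it on demand.
let a = MLXArray([1, 2, 3, 4] as [Float])
let b = a * 2 + 1   // no work happens yet; this records the computation
eval(b)             // forces evaluation on the device
print(b)            // prints the values 3, 5, 7, 9
```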
Q: How big is Apple's internal AI team?
A: Apple's foundation models team has about 100 people, led by distinguished engineer Ruoming Pang who joined from Google in 2021. This is relatively small compared to AI teams at other major tech companies.
Q: What happened to Apple's Swift Assist coding project?
A: Apple killed Swift Assist about a month ago after announcing it last year. The project would have helped developers write code in Xcode using Apple's own AI models. Instead, developers can now choose ChatGPT or Claude.
Q: When might we see the new AI-powered Siri?
A: Apple's enhanced Siri features are delayed until spring 2026 in iOS 26. Any version using outside AI models would come later, possibly with iOS 27 in fall 2026.