On Wednesday morning, Meta released a benchmark chart comparing Muse Spark against Claude, Gemini, and GPT. The model fell short on coding. It trailed on agentic benchmarks. It performed well on multimodal perception and health queries, categories that matter far less to the AI crowd tracking the frontier race. Every tech outlet ran the same angle. Meta's first model from its $14.3 billion Superintelligence Labs still can't match the leaders.
That framing misses what Meta actually shipped.
In the months before Muse Spark went live, the company assembled something more consequential than a chatbot. Last December, Meta activated a policy change allowing it to harvest AI conversations across Facebook, Instagram, Messenger, and WhatsApp as ad-targeting signals. Last month at Shoptalk, it launched affiliate marketing tools, one-tap checkout, and AI-powered product summaries designed to keep the entire purchase experience inside its apps. This week, alongside the model launch, it shipped nutrition tracking on smart glasses, turning the camera on your face into a food diary that feeds personalized recommendations through Meta AI.
None of that appears on a benchmark chart. All of it reveals where as much as $135 billion in planned 2026 capital spending is actually going.
The Argument
- Muse Spark trails on coding, but Meta's strategy targets commerce, personalization, and distribution over benchmarks
- Since December 2025, Meta has used AI conversations to personalize ads across all its apps, with no opt-out for most users worldwide
- Shoptalk 2026 tools create a closed shopping loop: creator affiliates, AI product summaries, and one-tap checkout inside Instagram
- Smart glasses extend data collection from screens to faces, making Meta's AI ambient and always-on
AI-generated summary, reviewed by an editor. More on our AI guidelines.
The model that doesn't need to win
Muse Spark was built by Meta Superintelligence Labs, the division Zuckerberg formed after hiring Alexandr Wang away from Scale AI in a deal reportedly worth north of $14 billion. The stated objective is personal superintelligence: AI that knows you well enough to anticipate what you need before you ask.
On benchmarks, Muse Spark Thinking scores competitively on multimodal perception, health reasoning, and entity recognition. It falls short on coding and long-horizon agentic work, the exact disciplines where Anthropic and OpenAI have concentrated their engineering firepower. Meta's own launch materials acknowledge "current performance gaps" in coding workflows and multi-step agents.
The same materials reveal priorities no benchmark captures. Future Meta AI answers, the company wrote, will weave in Reels, photos, and posts "with credit back to the content creators." Shopping mode will draw from "creators and communities people already follow." Health reasoning was developed with input from over 1,000 physicians. A private API preview for select partners signals that monetization channels beyond consumer chat are already under construction.
If you read this as a model launch, Muse Spark is a respectable also-ran. Read it as the first consumer-facing layer of a personalization and commerce engine serving 3.58 billion daily active people, and the strategy sharpens. Meta doesn't need the best model. It needs the most distributed one.

Your confessions, their ad engine
The single most consequential move Meta made this past year had nothing to do with model architecture. Last October, the company announced it would begin using interactions with Meta AI to personalize content and ads across every app in its family, effective December 16.
"People's interactions simply are going to be another piece of the input that will inform the personalization of feeds and ads," Christy Harris, Meta's privacy policy manager, told Reuters.
Piece of the input. That phrasing disguises scale. More than one billion people use Meta AI monthly. Unlike a liked post or a watched Reel, an AI conversation reveals intent in real time. Unfiltered. About decisions the user is actively making. Ask Meta AI about hiking boots and Meta learns you're shopping. Ask about parenting anxiety at midnight and the system learns something far more personal.
No opt-out exists. The policy runs everywhere except the EU, UK, and South Korea, where privacy laws forced exceptions. Thirty-six organizations led by the Electronic Privacy Information Center petitioned the FTC to suspend the practice. The FTC has not responded. Meta's disclosure to users ran one line: "We'll start using your interactions with AIs to personalize your experiences and ads."
Privacy advocates are alarmed less by the policy itself than by its design. Preferences that dial back personalization need regular revisiting because they reset as usage patterns shift. The system defaults to collection. Limiting it means clicking through nested menus across multiple accounts, a process most users will never complete.
For Meta's ad business, the arithmetic is clean. Traditional signals (likes, shares, follows) report what a user already did. AI chat data reports what they're about to do. One is a rearview mirror. The other is a windshield. Meta just fitted the windshield across an app family reaching more daily users than any platform on earth.
The tollbooth between you and checkout
At Shoptalk last month, Meta unveiled commerce features that would have dominated any other news cycle. Instagram creators can now tag products directly in Reels and earn affiliate commissions through partnerships spanning Amazon, eBay, Temu, Mercado Libre, and Shopee across 22 countries. Facebook creators get the same capability in posts and photos.
Then comes the AI layer. Click an ad and Meta takes over the research you'd normally do on your own. Reviews, pricing, brand context, all pulled together inside the app. No tabs. No comparison shopping. One-tap checkout through PayPal and Stripe closes the sale right there, with Shopify and Adyen next in line.
Discovery. Evaluation. Purchase. The buyer never leaves Instagram.
Anthropic and OpenAI are building the most capable models they can. Meta is building a tollbooth. Every creator tag, every AI product summary, every in-app checkout adds another lane to a commerce road running through Meta's territory. The model collecting the tolls doesn't need to match Claude on code generation. It needs to know what you want before you search for it, surface a creator who already reviewed it, and close the sale before you open a browser tab.
Meta tried in-app affiliate commerce before. Instagram killed its first test in 2022 after about a year. What changed is that Meta's AI now stitches the components together: creator content, behavioral data, chat-derived purchase intent, all feeding a closed loop that didn't exist when the first experiment failed. Andromeda and GEM, the AI engines the company rebuilt over the past two years to power its ad stack, already handle 60 to 70 percent of some agencies' Meta spending through automated Advantage+ campaigns, according to Jeremy Schulkin, SVP of services at Hawke Media.
Zuckerberg has described a future where advertising on Meta requires nothing more than a credit card and a business objective. The AI handles targeting, creative, placement. Some agencies find this promising. Others feel cornered by a platform that keeps stripping away the controls they built careers around.
The data collector on your face
Meta's hardware play looks scattered until you see it through the tollbooth frame. This week, alongside Muse Spark, the company announced that Ray-Ban Meta and Oakley smart glasses can now track what users eat. The camera logs nutrition data, and Meta AI serves personalized food recommendations in return. New prescription-focused models launching this month are designed to push the product beyond the early-adopter crowd and into daily wear.
The glasses collect photos, video, and voice. Under the December policy change, as Proton's privacy team documented, all of that feeds into Meta's ad-targeting infrastructure. You don't need to open an app. The signal flows through the lenses on your face.
Meta's own product language about the glasses is telling. AI will "transition from something you have to prompt each time to a more continuous, in-the-moment assistant." Strip away the marketing and the description reads plainly: an always-on AI surface observing meals, surroundings, and conversations, funneling those observations into the same personalization engine that 3.58 billion daily users are already feeding with behavioral data.
Look at the money and you see the same ambition. Nearly $200 billion in 2025 revenue. Then Meta told Wall Street to expect $115 to $135 billion in 2026 capital spending, almost double what it burned through the year before. CFO Susan Li attributed the jump to "Meta Superintelligence Labs efforts and core business." Both terms point the same direction. Advertising is the core business. Those labs exist to sharpen it. And glasses? They're the form factor that makes collection ambient.
The race nobody else can run
For Meta's rivals, this strategy has no clean counter. OpenAI has hundreds of millions of weekly ChatGPT users but owns no social graph, no creator affiliate network, no hardware collecting data from faces. Anthropic doesn't have consumer distribution at all. Google comes closest with Search and Android, but its ad engine still depends on sending users to external sites where the purchase happens off-platform. Meta's architecture runs in the opposite direction: keep users inside from first product glimpse to final checkout tap, gathering behavioral data at every step.
For users, the trade keeps expanding. Free apps in exchange for data has been the bargain since Facebook's founding. The new clause is that AI conversations, the kind people treat like private confessionals, are part of the deal. And for most of the world's internet users, the deal has no off switch.
Whether Zuckerberg's spending is rational or reckless won't be settled by a coding benchmark. It will be settled by a revenue line. Can Meta convert what may be the deepest behavioral dataset on the planet into enough ad dollars to justify $135 billion a year in infrastructure? Every piece of the Muse Spark rollout, the shopping tools, the privacy policy, the glasses, the creator affiliates, exists to make that conversion work.
When Meta posted the benchmark chart on Wednesday, it buried a capability detail that almost nobody picked up. Muse Spark excels at "entity recognition and localization." In plain language, the model is unusually good at identifying objects and places inside images and video.
Coding doesn't need that. Commerce does. It is precisely the skill required when a pair of smart glasses watches someone hold a product and the system needs to identify the item, find the price, and surface the creator who reviewed it last week.
The leaderboard says Muse Spark is behind. The toll road is already collecting.
Frequently Asked Questions
What is Meta's Muse Spark AI model?
Muse Spark is Meta's first AI model from its Superintelligence Labs, built for multimodal perception, health reasoning, and shopping recommendations. It powers the Meta AI assistant across Facebook, Instagram, WhatsApp, Messenger, and smart glasses.
How does Meta use AI conversations for advertising?
Since December 16, 2025, Meta has used interactions with Meta AI to personalize content and ads across all its apps. Chat topics inform ad targeting in real time, with no opt-out available outside the EU, UK, and South Korea.
What shopping features did Meta announce at Shoptalk 2026?
Meta launched creator affiliate links in Reels, AI-powered product summaries, and one-tap checkout through PayPal and Stripe. The tools keep the entire purchase process inside Instagram and Facebook.
How much is Meta spending on AI in 2026?
Meta guided $115 to $135 billion in 2026 capital expenditure, nearly double its 2025 spending, driven by AI infrastructure and Meta Superintelligence Labs.