Google Squints.
San Francisco | December 9
Sergey Brin is back with glasses that can't handle a sunny window. Google announced AI glasses for 2026. But the real story is the third product: wired XR glasses running Samsung's $1,800 headset chip in a pocketable form. Google doesn't want to sell you glasses. It wants Android on your face.
Sergey Brin stood on stage at Google I/O in May and did something unusual for a tech executive. He admitted failure. Google Glass, he conceded, suffered from immature AI, supply chain inexperience, and prices that made consumers wince. A decade of ridicule had apparently taught the company something.
Now Google is back with a different approach. On Monday, the company announced two categories of AI glasses arriving in 2026, built with partners Samsung, Warby Parker, and Gentle Monster. But buried in the announcements was a third product that reveals far more about Google's actual intentions: Project Aura, a pair of wired XR glasses from Xreal that runs the same Snapdragon XR2+ Gen 2 chip powering Samsung's $1,800 Galaxy XR headset. Same processing power. Fraction of the bulk.
Google wants to put a screen on your face. The question is which screen you'll accept.
The Breakdown
• Google is launching three glasses tiers in 2026: audio-only, in-lens display, and wired XR running Samsung's $1,800 headset chip
• Project Aura delivers full Android XR apps in glasses form, but requires a processing puck clipped to your belt
• Raxium's Micro LED displays failed in sunlight during demos. Google promises fixes. The tech currently loses to the sun
• Platform strategy: one Android XR ecosystem across all devices, betting developers will choose reach over Meta's fragmented stack
Google's glasses break into three tiers, each targeting different tolerance levels for looking like a tech bro at brunch.
The first tier: audio-only AI glasses. Speakers, microphones, and cameras packed into frames that could pass for regular eyewear. No display. Users talk to Gemini, ask questions about their surroundings, take photos. If this sounds familiar, it should. Meta's Ray-Ban glasses follow the same template and have sold well. Meta reported in October that the newest model sold out "in almost every store within 48 hours."
The second tier adds an in-lens display. Google showed journalists prototypes in both monocular and binocular configurations. The monocular version places a single Micro LED screen in the right lens, positioned slightly below the natural sightline. Navigation arrows from Google Maps. Translation captions during conversations. Photo previews without pulling out your phone. The binocular version doubles the display real estate and enables 3D content.
Both tiers connect wirelessly to smartphones for processing. That's the trade-off enabling slim frames.
The third tier abandons that constraint entirely. Project Aura, developed with Chinese glasses maker Xreal, tethers to an external processing puck containing the same chip that powers Samsung's full XR headset. The specs are heavy: 70-degree field of view, full hand and room tracking. It runs every Android XR app available on Galaxy XR. Journalists who tested the prototype reported playing Demeo, a VR tabletop game, using hand gestures in mid-air. They streamed Windows desktops wirelessly and controlled them with finger movements. One tester circled a floor lamp with a gesture, triggering Google's Circle to Search.
The puck requires a tethered connection. Always. But it clips to your belt, and the glasses fold into a jacket pocket. That's a fundamentally different proposition than lugging a ski-goggle headset through an airport.
Google bought display startup Raxium in 2022. Nobody outside Mountain View knew exactly why. Apple was rumored to be building a headset, so maybe it was defensive. Maybe Google just wanted the talent. Three years of silence, and now the acquisition finally has a product attached to it.
The display glasses run on Raxium's Micro LED tech. The pitch for Micro LED has always been brightness. Pack more light into a smaller space than OLED can manage, and you get a display that works outdoors. That's the theory. Glasses you can actually wear in daylight without squinting at a dim overlay.
The prototype didn't deliver. Journalists at the demo found the display washed out when they looked toward a window. Google says production models will fix this. But right now, the tech loses to the sun.
The binocular prototype hints at where this technology leads. Google showed a dual-display version that's "not much bigger in size" than the single-display model, suggesting a potential 2027 release. If the company can deliver two Micro LED displays in frames that don't make you look like a welder, it would represent a genuine advancement over Meta's current offerings.
Meta's $799 Ray-Ban Display glasses, released in September, require a separate accessory called the Neural Band, a wristband that tracks hand gestures through electrical signals in the user's arm. Google's watch integration offers an alternative. Users with Android watches can tap their wrist to control the glasses, or glance at the watch to preview photos taken through the glasses' camera. Neither approach has proven superior. Both companies are still guessing.
The real shift is in the software. Android XR runs on the $1,800 Galaxy XR headset. It runs on Project Aura's wired glasses. It will run on the consumer frames coming next year. One operating system across all of it, which means one app ecosystem.
This matters because it inverts the typical consumer electronics launch sequence. Instead of releasing hardware and hoping developers build for it, Google launched Galaxy XR in October with immediate access to Android's existing app ecosystem. The company claims that "a ton of Android apps will just work at launch without developers having to lift a finger." Android XR renders those existing app interfaces automatically.
For developers willing to optimize, Google released Developer Preview 3 of the Android XR SDK on Monday, opening up APIs specifically for glasses. Partners like Uber and GetYourGuide are already building native experiences. During demos, journalists saw Uber showing pickup directions overlaid in the glasses' display, with a map visible when they looked down.
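For a sense of what a native Android XR experience involves, here's a minimal sketch of a spatialized Compose screen using the Jetpack XR Compose preview libraries. The androidx.xr.compose names follow Google's developer preview documentation and may shift between preview releases; this is an illustration, not code from the Uber or GetYourGuide demos.

```kotlin
// Minimal Android XR sketch using the Jetpack XR Compose preview APIs.
// Names follow the Developer Preview docs and may change before release.
import android.os.Bundle
import androidx.activity.ComponentActivity
import androidx.activity.compose.setContent
import androidx.compose.material3.Text
import androidx.compose.ui.unit.dp
import androidx.xr.compose.subspace.SpatialPanel
import androidx.xr.compose.subspace.Subspace
import androidx.xr.compose.subspace.layout.SubspaceModifier
import androidx.xr.compose.subspace.layout.height
import androidx.xr.compose.subspace.layout.movable
import androidx.xr.compose.subspace.layout.resizable
import androidx.xr.compose.subspace.layout.width

class MainActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContent {
            // Subspace opens a 3D volume; a SpatialPanel hosts ordinary
            // 2D Compose content inside it. This is the mechanism that
            // lets existing Android UIs show up in XR without a rewrite.
            Subspace {
                SpatialPanel(
                    SubspaceModifier
                        .width(1024.dp)
                        .height(640.dp)
                        .movable()     // user can reposition the panel in space
                        .resizable()   // user can grab corners to resize it
                ) {
                    Text("Hello, Android XR")
                }
            }
        }
    }
}
```

The design intent is visible in the structure: the 2D Compose content is unchanged, and the spatial behavior is layered around it.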
The SDK also enables something called System Autospatialization. Google says it arrives in 2026. The idea: AI that converts flat content into 3D on the fly. Stream a game from your PC, watch it gain depth. Pull up a YouTube video, same thing. Google's demo showed Cities: Skylines with the UI floating in front and the city receding behind it.
That's an aggressive claim. Real-time depth estimation on a mobile chip while running a full XR operating system would be genuinely impressive. Or it's marketing vapor. We'll know when it ships.
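To make the claim concrete, here's a toy sketch of the layering step such a system would need once some monocular depth model has produced a per-pixel depth map. The depth model itself is the hard, unverified part, and nothing here reflects Google's actual pipeline; it just shows what "flat content gains depth" means mechanically.

```kotlin
import kotlin.math.abs

// Toy depth-based layering, NOT Google's pipeline. Assumes a monocular
// depth model has already produced `depth`: one value per pixel, in meters.
data class Layer(val depthMeters: Float, val pixelIndices: List<Int>)

// Snap every pixel to its nearest depth plane, producing flat "cards"
// that a renderer could place at different distances for parallax.
fun splitIntoLayers(depth: FloatArray, planes: FloatArray): List<Layer> =
    planes.map { plane ->
        Layer(
            depthMeters = plane,
            pixelIndices = depth.indices.filter { i ->
                planes.all { other -> abs(depth[i] - plane) <= abs(depth[i] - other) }
            }
        )
    }

fun main() {
    val depth = floatArrayOf(0.5f, 0.6f, 3.1f, 2.9f, 9.0f)   // five-pixel "frame"
    val layers = splitIntoLayers(depth, floatArrayOf(0.5f, 3.0f, 9.0f))
    layers.forEach { println("plane ${it.depthMeters} m -> pixels ${it.pixelIndices}") }
}
```

Doing that per frame, at display refresh rates, on a battery-powered chip is where the claim gets aggressive.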
Google's original smart glasses became a cultural punchline. "Glassholes," people called the early adopters. Bars banned the devices. The $1,500 price ensured only the most committed enthusiasts would endure the social stigma.
The new glasses address the technical failures explicitly. Brin cited "less advanced AI" as a core problem with the original product. He's not wrong. Siri in 2013 could barely set a timer. Gemini can look at a photo of your fridge and suggest dinner recipes. It translates conversations while you're having them. The jump in capability is real, even if the marketing oversells it.
Price remains unknown. Google hasn't announced pricing for any variant. Meta's Ray-Ban collaboration starts at $299 for audio-only frames and reaches $799 for the display model. Warby Parker's SEC filing confirmed its Google partnership launches in 2026 but offered no numbers. The $150 million commitment Google made to Warby Parker in May suggests serious manufacturing scale, not a one-off collaboration.
Privacy gets a more visible treatment. Like Meta's Ray-Bans, the new glasses include indicator lights that illuminate when cameras or AI features activate. Gemini users can delete prompts and activity through the app.
People still do not want to be recorded. A blinking light doesn't fix that.
"Our belief here is that glasses can fail based on a lack of social acceptance," said Juston Payne, Google's director of product management for Android XR. He's acknowledging that technical capability isn't the bottleneck. Getting humans to tolerate cameras on other humans' faces is.
Meta leads this market today. Apple reportedly plans glasses without displays as its entry point. Timing unclear. Snap's next Spectacles arrive in 2026. Alibaba has entered the space in China.
Google wants the OS on every piece of glass you own. Audio-only glasses target Meta's Ray-Ban success directly. Display glasses aim at the segment Meta created with its $799 model in September. Wired XR glasses like Project Aura have no direct Meta equivalent, positioning Google to claim territory before competitors arrive.
The Galaxy XR headset, meanwhile, keeps evolving. Monday's updates added PC Connect, enabling wireless Windows desktop streaming with hand-tracking control. A new "Likeness" feature scans your face via smartphone to generate a photoreal avatar for video calls, matching Apple Vision Pro's Persona. Travel Mode stabilizes the view during flights and car rides.
Chi Xu, Xreal's CEO, framed Project Aura as "a stepping stone" toward fully wireless glasses. The processing puck represents current chip limitations, not design philosophy. "Once we really have a great experience with the puck, with glasses, the next question will be, wow, when can we replace the puck with our phone?"
Google is building the software for a future where the puck disappears. Until then, you're going to have to clip it to your belt.
For Meta: Google's play here is platform, not hardware. Get Android XR running on enough devices and developers start building for it first. Meta's been winning on execution, shipping Ray-Bans that people actually buy. But Meta's software stack is proprietary and fragmented. If Google pulls off the unified ecosystem play, Meta's hardware lead starts to erode. We'll know within 18 months.
For developers: The Android XR SDK creates a decision point. Building for Apple means waiting for hardware that may arrive in 2026 or later. Building for Meta means targeting a proprietary platform. Building for Android XR means potential reach across Samsung headsets, Xreal glasses, and upcoming consumer frames from Warby Parker and Gentle Monster. The platform that attracts developer attention first tends to keep it.
For consumers: Expect aggressive pricing on audio-only models as Google and Meta fight for the entry-level segment. Display glasses will command premium prices through 2026, dropping as manufacturing scales. The wired XR category offers the most capability today, but at prices and with tethering requirements that limit mainstream appeal.
Q: What is Raxium and why did Google buy it?
A: Raxium is a display startup Google acquired in 2022 for an undisclosed sum. The company specializes in Micro LED technology, which packs more brightness into smaller displays than OLED. This matters for glasses because outdoor visibility is the main technical challenge. Apple was rumored to be building a headset at the time, so the deal may have been partly defensive.
Q: How does Project Aura's processing puck work?
A: The puck is a separate device containing the Snapdragon XR2+ Gen 2 chip, the same processor in Samsung's $1,800 Galaxy XR headset. It connects to the glasses via a permanent tether cable and clips to your belt or pocket. This setup moves the heavy computing off your face, keeping the glasses lighter while delivering full Android XR app support.
Q: What is Meta's Neural Band and how does it compare to Google's approach?
A: The Neural Band is a wristband Meta requires for its $799 Ray-Ban Display glasses. It reads electrical signals in your arm to detect hand gestures. Google skipped this, instead letting users control glasses through Android smartwatches with taps and swipes. Neither method has proven better yet. Both companies are experimenting.
Q: Will Google's AI glasses work with iPhone?
A: Yes, but with limits. Google confirmed iPhone compatibility through the Google app. However, deeper features require Android. On Android phones, Gemini hooks into the full operating system for richer integration. iPhone users get basic functionality. Google is prioritizing its own ecosystem while keeping the door open for Apple users.
Q: What is the 70-degree field of view on Project Aura, and is that good?
A: Field of view measures how much virtual content you see at once. Project Aura's 70 degrees is narrower than the Galaxy XR headset's 100 degrees but wider than any previous AR glasses from Xreal. For comparison, human peripheral vision spans roughly 180 degrees. The 70-degree view is a trade-off for the lighter, more portable form factor.
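For intuition, field of view converts to apparent screen size with basic trigonometry. A back-of-envelope sketch, where the 2-meter viewing distance is an assumption, not anything from Xreal's spec sheet:

```kotlin
import kotlin.math.tan

// Width of virtual content covered by a given field of view at a given
// viewing distance: simple trigonometry, not a manufacturer claim.
fun spanMeters(fovDegrees: Double, distanceMeters: Double): Double =
    2 * distanceMeters * tan(Math.toRadians(fovDegrees / 2))

fun main() {
    val d = 2.0  // virtual screen placed 2 m away (assumed)
    println("70°  -> %.1f m wide".format(spanMeters(70.0, d)))   // ~2.8 m
    println("100° -> %.1f m wide".format(spanMeters(100.0, d)))  // ~4.8 m
}
```

The gap between 2.8 and 4.8 meters of apparent width is real. So is the weight difference that buys it.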
Get the 5-minute Silicon Valley AI briefing, every weekday morning — free.