All the usual suspects of big tech are engaged in a not-so-secret arms race to be the first to develop mass-market augmented reality glasses. We already know Google teamed up with Samsung to develop an Android-centric pair of glasses, but the company recently showed off another device to a small audience at the TED2025 conference, and already we can tell it's the kind of XR (aka "extended reality") that may help us escape the tyranny of overlarge VR headsets.
Shahram Izadi, the VP and general manager of XR at Google, strode out on TED's stage wearing what initially appeared to be a typical, if overly large, pair of glasses (think Meta's huge, still-unreleased Orion smart glasses). But they were Google's XR glasses, and Izadi claimed they were displaying his speech notes to him as he talked.
The glasses pack a microphone, camera, and speakers to gather as much information as possible. Google's first XR device since Google Glass has an "in-lens display," which Izadi held up to the camera for a scant few seconds, remarking, "it's very, very small." That could point to Google experimenting with waveguide-type glasses displays, featured in devices like the latest RayNeo glasses. The glasses themselves run Android XR, Google's homegrown OS for extended reality devices.

Google product manager Nishtha Bhatia showed the crowd what the glasses' display looked like via the built-in camera. She tapped one arm of the glasses, which brought up the familiar blue-star Gemini logo at the bottom of her display, and after a half second the AI-voiced chatbot was ready to offer an eyeroll-worthy haiku about the gathered audience with their "faces all aglow." Gemini can translate text it sees into different languages, though Izadi suggested that feature may produce mixed results. The same camera can also parse text and graphs and turn them into more digestible soundbites.
Izadi and Bhatia also showed off the "memory" feature that lets the AI recall things it saw through the camera in the recent past. It's akin to what Google DeepMind demoed with Project Astra last year. Google has slowly been adding Astra features, including photo and video recognition, to the Gemini Live chatbot interface, and it seems the company is looking to integrate similar features into an upcoming pair of AR glasses.
The glasses should also connect with your smartphone and "access all your phone apps." Yet the real killer feature is the connection with your other Google apps. Bhatia asked the glasses to look at a record from singer Teddy Swims and play a track, and they opened YouTube Music and played the requested song. We already enjoy the Ray-Ban Meta glasses well enough for their solid speakers, no earbuds needed, so this is a no-brainer for Google. Even wilder, the display inside the glasses could work with Google Maps, offering a semi-holographic Google Street View image for navigation.
Google has been working behind the scenes on AR glasses for several years, even after the demise of Google Glass. Curiously, the pair shown at TED may not be sold under Google's own banner. The search giant is working hand in hand with Samsung on its Project Moohan headset, powering the device with Android XR. Google showed off Samsung's prototype headset on the TED2025 stage. From what we've seen of Moohan, which may be a premium-priced device, the headset will act much like today's Apple Vision Pro, but with more Gemini-enabled features.
Samsung has also strongly hinted it’s working on a separate pair of smart glasses, but we doubt what Google showed off this week is what’s rumored to arrive later this year. Korean publication ETNews (read with machine translation) reported last month the upcoming device may not have a display or buttons. It would instead rely on a microphone for speech and a camera for gesture controls.
Meanwhile, Meta may also be working on an expensive pair of glasses with a tiny screen dedicated to an app shelf at the bottom of the right-hand lens. A display is what will truly set "augmented reality" glasses apart from the audio-focused glasses we've seen so far. To stand apart, Google will need to figure out how to use a bigger display while balancing battery life and the weight of the glasses. But if it does, we may finally get our AR kicks without needing to shove a heavy headset over our eyes.