I’ll be honest with you: I think smart glasses are awesome. That’s not to say they work well all the time, or that they do even half of what they should at this stage, or that they don’t need a massive dose of computing power. All of those criticisms are true, in my honest opinion. But even after having to scream at my pair of Meta Ray-Bans to play Elton John this morning, only for them to repeatedly try to play John Prine instead, I love them for one thing: their potential.
And Google—if I/O 2025 is any indication—sees that same potential, too. It spent quite some time during its keynote laying out its vision for what smart glasses might be able to do via its Android XR smart glasses platform and supporting hardware made with Xreal. The possibilities are enticing. With Android XR and smart eyewear like Project Aura glasses, the form factor could actually get to the next level, especially with optical passthrough that can superimpose stuff like turn-by-turn navigation in front of your eyeballs.

If you were watching the live demo yesterday, it probably looked like we were all right on the cusp of being able to walk around with the futuristic pair of smart glasses we’ve been waiting for, and that impression is partly true. Smart glasses—ones that do all the fun stuff we want them to do—are imminent, but unfortunately for everyone, Google included, they’re likely going to be a lot harder to perfect than tech companies would have us believe. When it comes to glasses, there are still constraints, and quite a few of them.
For one, if smart glasses are going to be the future gadget we want them to be, they need some kind of passthrough. Technologically speaking, we’re already there. While Gizmodo’s Senior Editor, Consumer Tech, Ray Wong, only got about a minute to try Project Aura at Google I/O, he can attest to the fact that the glasses do indeed have an optical display that can show maps and other digital information. The problem isn’t the screen, though; it’s what having a screen might entail.
I’m personally curious about what having optical passthrough might do to battery life. Size is still the biggest issue when it comes to functional smart glasses, which is to say that it’s difficult to cram all of the hardware we need into frames that don’t feel much heavier than a regular pair of glasses. Think about it: you need a battery, compute power, drivers for the speakers, etc. All of those things are relatively small nowadays, but they add up. And if your glasses can suddenly do a lot more than they used to, you’re going to need a battery that reflects those features, especially if you’re using optical passthrough, playing audio, and querying your onboard voice assistant all at once. A bigger battery means more weight and a bulkier look, however, and I don’t really care to walk around with Buddy Holly-lookin’ frames on my face all day.
I was promised 5 minutes with the Google Android XR smart glasses, but they only gave me 3 minutes, and half of that was explaining what they were and how they worked, so I actually only had 90 seconds to use Gemini on a painting on a wall, two books on a bookshelf, and the… pic.twitter.com/Ly60boX91G
— Ray Wong (@raywongy) May 20, 2025
Then there’s the price. Meta’s Ray-Ban smart glasses are already relatively expensive ($300+, depending on the model) despite not even having a screen in them. I don’t have the projected bill of materials for Project Aura or Google’s prototype Android XR glasses, obviously, but I am going to assume that these types of optical displays are going to cost a little bit more than your typical pair of sunglasses. That’s compounded by the fact that the supply chain and manufacturing infrastructure around making those types of displays likely aren’t very robust at this point, given that people really only started buying smart glasses in any mainstream way about five minutes ago.
I sound like I’m naysaying here, and maybe I am a little bit. Shrinking tech down, historically speaking, has never been easy, but these are some of the most powerful and well-resourced companies in the world making this stuff, and I don’t for a second doubt that they can figure it out. It’s not the acumen that would prevent them; it’s the commitment. I want to believe the focus is there, given all of the time Google devoted to showcasing Project Aura yesterday, but it’s hard to say for sure. Google, despite the behemoth it is, is being pulled in various directions—for example, AI, AI, and AI. I can only hope that, in this case, Google actually puts its money where its futuristic spy glasses are and powers Project Aura through to fruition. C’mon, Google, daddy needs a HUD-enabled map for biking around New York City.