Apple’s bulky, powerful, occasionally beautiful, and way-too-expensive XR headset, the Vision Pro, is being drawn deeper into Apple’s software ecosystem, and closer to the wider VR market. At WWDC25, Apple shared more about this next step toward a more mature “spatial” ecosystem with visionOS 26. The update should allow for easier control with your eyes and, for the first time, actual controllers borrowed from other headsets, like Sony’s PlayStation VR2.
While the rest of the Apple ecosystem is changing its look to match what was already on the Vision Pro, the look inside Apple’s headset isn’t changing much, despite the version-number growth spurt from visionOS 2 to visionOS 26. The first big improvement is eye-scrolling, which means users no longer have to pinch and drag to scroll through a web page or PDF. Apple’s “spatial computer” should instead use the headset’s eye-tracking to jump to where you want to be on the page.

There are also all-new widgets built specifically for the Vision Pro. They’re meant to be placed against a wall in AR space, with a subtle 3D effect that makes each widget look as if it were set into the wall. Another widget acts like a fake window looking out on a panorama photo you’ve taken with your phone. Apple is also opening up the OS to support more 180- and 360-degree footage from companies like Insta360 and GoPro, which means you may get access to more 3D content than what Apple itself is willing to share with users.

One of the headlining features for the Apple Vision Pro was “Personas,” which were supposed to act as lip-synced 3D avatars for users talking over FaceTime or other supported apps. At launch, these had a waxy, dead-eyed appearance that was equal parts intriguing and off-putting. The new update could finally offer a more lifelike appearance, with finer detail in users’ hair and eyes.
Currently, the Apple Vision Pro’s hand-tracking recognizes several gestures for navigating apps. The pinch has been the most important for daily use, though a bare few apps could also read the orientation of your fists, as if you were gripping an invisible steering wheel. That isn’t anywhere close to enough for most VR games. Finally, the AVP will accept third-party controllers. First on the list are the Logitech pen for mixed reality art apps and the PlayStation VR2 Sense controllers. This should make the headset easier to use when you need pinpoint control, as in drawing apps.
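For developers, controller support will presumably flow through the same GameController framework that already handles gamepads on Apple’s other platforms. Below is a minimal sketch of that pattern, assuming the new controllers show up as standard GCController devices; the notification, input-handler, and haptics APIs shown exist today on Apple platforms, but the Vision Pro specifics are our assumption, not confirmed code.

```swift
import Foundation
import GameController

// Sketch: listen for any supported controller (say, a PS VR2 Sense
// controller paired with the headset) and react to basic input.
NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect,
    object: nil,
    queue: .main
) { notification in
    guard let controller = notification.object as? GCController else { return }
    print("Connected: \(controller.vendorName ?? "unknown controller")")

    // The extended gamepad profile exposes thumbsticks, triggers, and buttons.
    if let gamepad = controller.extendedGamepad {
        gamepad.buttonA.pressedChangedHandler = { _, _, pressed in
            print("Primary button pressed: \(pressed)")
        }
    }

    // Rumble, where the controller supports it.
    if let haptics = controller.haptics {
        let engine = haptics.createEngine(withLocality: .default)
        try? engine?.start()
    }
}

RunLoop.main.run() // keep the demo alive so connect notifications arrive
```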

Apple suggested this “new class of games” coming to the AVP will include such heavy hitters as the pickleball simulator Pickle Pro. The introduction of third-party peripheral support could be a big deal: the headset tracks the controllers with six degrees of freedom (6DoF), and with vibration support, it could offer one of the more immersive VR experiences available. It also means we may finally get more ports of other VR games. We can already imagine how nice it would be to get Steam Link working on a Vision Pro to play Half-Life: Alyx across the headset’s relatively wide field of view and twin 4K micro-OLED displays.
We shouldn’t feel too disappointed that Apple didn’t craft its own first-party controllers; PlayStation’s VR2 Sense controllers are a solid option. The Apple Vision Pro sometimes feels like the red-headed stepchild of the Cupertino tech giant’s larger brand. It’s been around for more than a year, and it has improved significantly with every new update from visionOS 2 onward. The latest updates to guest accounts made it a better device to share with people nearby, and turning your Mac screen into an ultrawide monitor in AR space is both cool and surprisingly useful.

What’s missing is pure content. Apple has produced a good amount of short- and long-form video viewable exclusively on the AVP, including a full-length documentary about Bono and short films like Submerged. But for every bit of passive content that arrives on the platform, there has been a dearth of the active content we mostly associate with VR and AR, especially gaming.