It’s been nearly a decade since Apple revealed its last major product, the Apple Watch.
Now the company has a completely different kind of wearable: the Vision Pro, a mixed-reality headset it revealed at its annual Worldwide Developers Conference on Monday.
I got to take one for a test drive.
Apple gave some WWDC attendees a controlled demo of the Vision Pro, walking us through how to manipulate apps and other content in 3-D space and how to train the internal cameras to track our eyes. Many features weren’t available for us to try, like Siri voice controls and the camera that will let you capture 3-D images and videos. The Vision Pro won’t go on sale until early 2024, Apple says, so the company likely has a few more kinks to work out before the public gets the full experience.
Still, my demo gave me a taste of where Apple fits in the burgeoning headset space.
But first we had to deal with my eyeglasses. The Vision Pro isn’t roomy enough to wear over glasses. The solution: a system of snap-in prescription lenses. An Apple representative took my glasses and placed them on a machine that could read my prescription. By the time I got to the demo room, a customized set of lenses was waiting for me inside the Vision Pro. (You don’t need to worry about this if you wear contacts.)
Then it was time to jump in.
The headset was comfortable, with a cushy fabric lining around the face and a headband keeping it strapped to my head. But, like every other headset I’ve tested, it started to feel a bit heavy and uncomfortable by the end of my 30-minute demo.
When I first switched on the device, the external cameras fed the outside world to the sharp displays on the inside. It was crystal clear — almost shockingly so. While Meta’s most advanced headset, the Quest Pro, feeds you blurry, pixelated images of the outside world, Apple Vision feels like you’re looking through glass, not at a screen.
Then it was time to dive into what the Apple Vision can do. Pressing the dial on the top right of the device, which Apple calls a Digital Crown like the one on the Apple Watch, brings up a menu of app icons. It’s sort of like pressing the home button on an old iPhone. All the standard Apple apps you’d expect to see were there: photos, iMessages, Apple TV, Safari and so on, floating in front of me.
To select an app (or anything else you want to “click”), you look at what you want and then make a pinching gesture with your thumb and index finger to select it. Cameras on the inside of Apple Vision track your eyes and recognize what you’re looking at. The external cameras track your hand movements. Meta’s headset has a similar feature, but it doesn’t work nearly as well as it does on Apple Vision, if it works at all. (Meta ships its headset with a wireless controller for better control.)
Opening an app brings up a window that floats before you, and you can surround yourself with apps if you’d like, almost like working across multiple screens on a desktop computer. The apps look just as crisp as they do on an iPhone or MacBook. That’s significant — until now, I’ve never used a headset with visuals that clear.
I went through several demos, like browsing a library of images in Photos, including panoramic ones that made it feel like I was inside the scene. Apple TV is another key app — you can place a virtual movie screen anywhere in the room. I watched a 3-D clip of the latest “Avatar” movie, and it was just as clear as watching on my 4K TV at home.
You can also rotate the Digital Crown clockwise to bring the headset into full virtual reality and place yourself in an immersive environment, like a starry night in the wilderness. I especially liked that for watching a movie — it felt like I was sitting in my own personalized IMAX theater.
But VR doesn’t take you fully out of the real world. If someone is in the room with you and you look at them for a few seconds, the headset slowly fades them into view within your immersive environment.
The other demo worth mentioning: FaceTime.
An Apple employee wearing her own Vision Pro in a separate room gave me a call, and she popped up in a window hovering in front of me. But it wasn’t her actual face — it was a realistic-looking avatar Apple calls a “persona.” Apple’s demo didn’t let me scan my face with the Vision Pro to make my own persona, but the one belonging to the woman I was talking to looked enough like her that it tricked me into thinking it was a regular video chat at first. It’s a far cry from Mark Zuckerberg’s cartoonish “metaverse selfie” that went viral last year.
Even though most of the focus with Apple Vision is on the visuals, I was equally impressed by the audio. The Vision Pro has a pair of speakers that sit near your temples and provide a surround sound effect. If you’ve ever used the spatial audio feature with AirPods, you’ll know what I mean. But the effect is much more pronounced in AR and VR, and it gave me a better sense of presence and immersion than the visuals alone. And even though the sound wasn’t pumping directly into my ears, other people in the room couldn’t hear it. (You can still pair AirPods with the Vision Pro, of course.)
Finally, there’s the price. The crowd at WWDC Monday — which was packed with some of Apple’s biggest fans — audibly groaned when the $3,500 price tag popped up on screen. But as someone who has tried just about every mainstream headset to date, I can tell you that the Vision Pro feels like a $3,500 machine. It’s that much more advanced than its next closest competitor, Meta’s Quest Pro.
That’s the state of this technology today: a mediocre-to-bad experience for several hundred dollars, or a premium and visually satisfying experience for thousands.
That alone should tell you the Vision Pro and other devices like it have a long way to go to move beyond a niche product.