Today’s announcement by Apple about the entertainment aspects of the Vision Pro was followed by new hands-on stories from Engadget and The Verge. A lot of what they saw was similar to the WWDC demos, but there were some new highlights, including additional Environments, a beta of the Disney+ app, Apple’s Encounter Dinosaurs app, and the Vision Pro’s floating keyboard.
One of the big open questions about the Apple Vision Pro is how well its virtual keyboard works. Interestingly, Engadget’s Cherlynn Low and Dana Wollman had very different experiences with it:
Cherlynn: It’s not as easy as typing on an actual keyboard would be, but I was quite tickled by the fact that it worked. Kudos to Apple’s eye- and hand-tracking systems, because they were able to detect what I was looking at or aiming for most of the time. My main issue with the keyboard was that it felt a little too far away and I needed to stretch if I wanted to press the buttons myself….
Dana: This was one of the more frustrating aspects of the demo for me. Although there were several typing options – hunting and pecking with your fingers, using eye control to select keys, or just using Siri – none of them felt adequate for anything resembling extended use. It took several tries for me to even spell Engadget correctly in the Safari demo.
Engadget’s editors were also impressed with Disney+’s Avengers- and Star Wars-themed environments.
The Verge’s Victoria Song and Editor-in-Chief Nilay Patel also spent some time with the Apple Vision Pro. According to Song’s story:
Nilay had shot some spatial videos where he’d intentionally moved the camera to follow his kid around the zoo and felt some familiar VR motion queasiness. Apple says it’s doing everything it can to reduce that, but it’s clear some shots will work better in spatial than others — like any other camera system, really.
Song describes the experience of seeing EyeSight demoed, too:
So we got to see a demo of EyeSight — what an onlooker would see on that front display when looking at someone wearing the Vision Pro. It’s a bit goofy, but you can see the wearer’s eyes, part of what Apple calls a “persona.” (We were not able to set up our own personas, sadly.) When Apple’s Vision Pro demo person blinked, we saw a virtual version of their eyes blink. When they were looking at an app, a bluish light appeared to indicate their attention was elsewhere. And when they went into a full virtual environment, the screen turned into an opaque shimmer. If you started talking to them while they were watching a movie, their virtual ghost eyes would appear before you. And when they took a spatial photo, you’d see the screen flash like a shutter.
What’s clear is that it’s one thing to read about these experiences with the Vision Pro and a completely different thing to live them. After reading several accounts, I still don’t know what to expect myself, except in the broadest sense. That’s both a little frustrating and very exciting.