
Notes on the Apple Vision Pro

June 2023

This piece was originally written and translated into Korean as part of LG Technology Ventures' Monthly Newsletter to business units and strategic partners. It is republished here with permission. Sensitive information has been removed.

Last week, Apple unveiled the long-awaited and long-rumored Apple Vision Pro headset. Details remain sparse, and exact specifications are not yet available. However, some professional VR-focused reviewers and testers managed to get hands-on time with the device, and I will summarize their findings along with our own analysis and conclusions.


Apple Vision Pro is built primarily as an AR pass-through device rather than a VR-first headset, and its pass-through is the best anyone has seen to date, blowing away the pass-through system on the Meta Quest Pro. It also has very good hand tracking: reviewers said the tracked hands “felt like their real hands,” with only a slight vignette around them when operating in VR mode. What most impressed reviewers about the hand tracking was how well it tracked their hands when resting in their laps or otherwise not in front of their faces (most reviewers were used to Oculus hand tracking, which requires keeping your hands directly in front of you at all times). They noted that the finger-tap gesture paired with eye tracking worked so well that they forgot they were even doing it. All testers did note that finger-tap detection failed once or twice, and that each failure broke the deep immersion they had been experiencing.
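As a rough illustration of what that interaction model means for developers, here is a minimal SwiftUI sketch based on the visionOS SDK Apple announced alongside the headset (the view and its contents are illustrative, not Apple's sample code): the system resolves eye gaze to a target and the finger-tap pinch arrives as an ordinary tap, so a standard button handler is all the app sees.

```swift
import SwiftUI

/// Illustrative view: on visionOS, looking at the button and pinching
/// fires this as an ordinary tap; the app writes no eye- or hand-tracking code.
struct PinchDemo: View {
    @State private var count = 0

    var body: some View {
        Button("Selected \(count) times") {
            count += 1   // triggered by gaze + finger-tap, not by touch
        }
    }
}
```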


Screen quality was generally considered good, with decent but not amazing FoV (field of view) and no visible “screen door effect”. Reviewers did note that although pixels per degree is approaching the point of being good enough to read text, the Apple employees running the extremely strict, extremely scripted demo were careful not to include anything that showed text, so reviewers were unable to judge for themselves. Some remarked that while the experience of using large floating windows is superior to working on a phone or MacBook Air, it pales in comparison to using any mid- to high-quality external monitor, which anyone buying at this price point likely already owns; the largest benefit of the Vision Pro for work is therefore its mobility. They noted, however, that it seemed like more of a content-consumption device than a content-creation device when used without a keyboard and trackpad/mouse (announced as supported accessories, but not shown at the demo).


Testers commented that the movie-watching experience (the 3D version of Avatar) was stunning: the headset's stereoscopic display made a huge difference in enjoying the 3D content and really showcased the quality of the screens. However, they said that the videoconferencing features using generated virtual humans were not impressive, noting that the avatars sat firmly in the creepy part of the “uncanny valley”. They also noted that there didn’t seem to be any compelling 3D / spatial content for this system (aside from 3D movies) and that almost everything you could do amounted to tileable 2D apps in a virtual space.


There are two very interesting implications of the features this product includes, and those it notably omits. The first is the lack of support for OpenXR (and Apple’s open hostility to the standard). This means not only that the Vision Pro is launching without any of VR’s killer apps (most notably Beat Saber), but also that it is orders of magnitude harder for VR developers to port their games and apps to it. It is likely that the vast majority of launch content for the Vision Pro will be 2D iOS apps ported to run in a floating window, possibly with some minor, gimmicky 3D features.
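A sketch of what that ported launch content likely looks like, based on the announced visionOS SDK (the view names here are illustrative): the WindowGroup an existing iOS app already declares becomes a floating 2D window, while genuinely spatial content requires a separate ImmersiveSpace scene built from scratch, with no OpenXR code path to reuse.

```swift
import SwiftUI

@main
struct PortedApp: App {
    var body: some Scene {
        WindowGroup {                      // the existing 2D iOS UI, now a floating window
            ContentView()
        }
        ImmersiveSpace(id: "immersive") {  // net-new 3D content; nothing ports over
            ImmersiveView()
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Ported 2D app") }
}

struct ImmersiveView: View {
    var body: some View { Text("Spatial content built from scratch") }
}
```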


Some (including Apple) argue that interacting with a virtual world with your hands alone, rather than with a controller, is the most “natural” way of working. These people ignore the fact that virtually every piece of art, writing, or engineering humanity has ever made was produced not with bare hands alone, but by people wielding tools (pens, chisels, keyboards, etc.). Just as typing on a full-size keyboard is much faster than typing on a touchscreen, navigating VR interfaces and environments with controllers is much better than with hands alone (this is one reason the majority of Oculus Quest interactions happen via the controllers rather than through its built-in hand tracking). The lack of controller support also means that years of VR UI development must be thrown away for the Vision Pro. Simple actions like selecting a different brush or tool, opening a menu, or muting and unmuting yourself will require complex hand motions and gestures rather than a single button press, as sketched below. Developers will need to completely rethink how people interact with their virtual environments for a hands-only interface.
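To make the cost concrete, here is a hypothetical sketch (all names and thresholds are invented) of the kind of gesture state machine a hands-only UI needs just to open a tool menu, a task that is a single button press on a controller.

```swift
import Foundation

/// Hypothetical pinch-and-hold recognizer: hold a thumb-index pinch past a
/// threshold to open a tool menu; release early and nothing happens.
enum PinchMenuState {
    case idle
    case pinching(start: TimeInterval)
    case menuOpen
}

struct PinchMenuRecognizer {
    var state: PinchMenuState = .idle
    let holdThreshold: TimeInterval = 0.6   // assumed hold time, tuned by feel

    /// Feed one hand-tracking sample per frame: is the pinch currently closed?
    mutating func update(pinchClosed: Bool, timestamp: TimeInterval) {
        switch state {
        case .idle where pinchClosed:
            state = .pinching(start: timestamp)
        case .pinching(let start):
            if !pinchClosed {
                state = .idle                                  // released too early
            } else if timestamp - start >= holdThreshold {
                state = .menuOpen                              // held long enough
            }
        case .menuOpen where !pinchClosed:
            state = .idle                                      // menu selection handled elsewhere
        default:
            break
        }
    }
}
```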


The lack of teleportation is another major challenge facing the Vision Pro. The vast majority of VR experiences take place in large virtual environments. Because these environments are often larger than the user’s physical space, users must navigate them not only by walking but also through other modes of locomotion, such as teleportation. The Vision Pro has no interface for teleportation or any other form of locomotion. Without one, Vision Pro users will not be able to interact with other users in the “metaverse” or any large virtual world; in fact, it may not be possible for any two users to meet in the same virtual space at all.
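For reference, teleport locomotion itself is simple. The sketch below (a minimal, illustrative implementation with invented names, assuming a flat floor at y = 0 and a normalized ray direction) shows the core of what the Vision Pro's interface omits: cast a ray from the hand or controller, intersect it with the floor, and shift the user's virtual origin to the hit point.

```swift
import simd

/// Compute where a teleport ray hits the floor plane (y = 0).
/// Assumes `rayDirection` is normalized; all names are illustrative.
func teleportTarget(rayOrigin: SIMD3<Float>,
                    rayDirection: SIMD3<Float>,
                    maxDistance: Float = 8.0) -> SIMD3<Float>? {
    // Ignore rays pointing up or parallel to the floor.
    guard rayDirection.y < -1e-4 else { return nil }
    let t = -rayOrigin.y / rayDirection.y   // distance along the ray to y = 0
    guard t > 0, t <= maxDistance else { return nil }
    return rayOrigin + t * rayDirection
}

/// Apply the teleport by shifting the user's play-space origin so their
/// feet land on the target, without physically walking there.
func applyTeleport(rigOrigin: inout SIMD3<Float>,
                   userFeet: SIMD3<Float>,
                   target: SIMD3<Float>) {
    rigOrigin += target - userFeet
}
```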

Testing HaptX haptic gloves

Haptics are another issue with the Vision Pro. Apple’s finger-tap gesture lets users generate haptic feedback with their own fingers as they touch, and while that may be sufficient for basic window navigation, it is woefully inadequate for any kind of game or immersive VR experience. Grabbing objects in a virtual world without haptic feedback is a terrible experience, and even haptic glove solutions such as HaptX (see photo of yours truly testing) are clunky and not accurate enough for precise or subtle interactions (like pulling a trigger or flipping a small switch). In my own testing, it was difficult even to pick up a virtual coffee cup. Apple’s controller-free approach may require more radical sources of haptic feedback. Open-air ultrasound haptics, such as the technology developed by Emerge, may be key to Apple developing a usable UX beyond floating 2D windows. Although Emerge’s approach is a niche and unnecessary technology for most VR systems, it (and similar systems) may be a necessary tool in Apple’s controller-less ecosystem.
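For contrast, here is a hypothetical sketch (every name in it is invented) of why controllers matter for grabbing: in a controller ecosystem, the runtime pairs the grab event with a short motor pulse so the user feels the contact, while on a bare-hands device there is simply no actuator to drive.

```swift
import Foundation

/// Hypothetical actuator interface: on a controller this maps to its
/// vibration motor; a bare-hands headset has nothing to bind it to.
protocol HapticActuator {
    func pulse(amplitude: Float, duration: TimeInterval)
}

struct GrabInteraction {
    let haptics: HapticActuator?   // nil on a hands-only device

    /// Called when the user's hand closes around a virtual object.
    func onGrabBegan() {
        // Physically confirm the contact; silently does nothing without hardware.
        haptics?.pulse(amplitude: 0.8, duration: 0.05)
    }
}
```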


Ultimately, Apple’s Vision Pro is a unique device. It shares some heritage with VR and AR systems, but it will require building out an ecosystem completely from scratch if Apple aspires for it to be anything other than a niche portable-monitor replacement.
