One of the most striking design principles of the Apple Vision Pro is its absence of “stuff attached to your hands”. The unexpected lack of visible interfaces on your palms or wrists significantly enhances the daily experience, contrasting sharply with what’s typical on other platforms.
Why is this important?
When the Quest (Meta’s competing VR headset) was initially launched in 2019, it lacked hand tracking, necessitating the use of controllers with a built-in menu button.
Later, with the introduction of hand tracking, Meta relocated these buttons to a small floating menu that appears when you glance at your right hand. This small menu follows you across experiences, and its intrusion has significant implications:
- For immersive experiences, this menu is a disruptive element. Its distinct micro-aesthetic, icons, and transitions clash with the designed environment, shattering immersion.
- In interactive experiences, this menu monopolises prime space for interactions; placing anything else there becomes a hot mess.
- For first-time users, particularly at events like fairs or in museum settings, there’s a risk of accidental exits from the application due to unintended interactions with this menu.
- Most crucially, frequent users subconsciously adapt their behaviour to avoid the ‘illegal’ space on their right hand, much like tuning out a persistent annoyance. This adjustment detracts from natural engagement across all experiences.
Apple Vision Pro’s design principle
The Apple Vision Pro introduces a radical design principle: Nothing.
No intruding menus; the primary interactions are look, pinch, and drag. For me, experiencing it was a revelation. This might sound dramatic, but when you are accustomed to the Quest menu following you around like a little fly, seeing nothing when you look at your hands is refreshing.
It’s best described as a significant release of tension; a part of my brain can finally relax.
By extension, this underscores the impact of ecosystems and how profoundly small design decisions shape our experiences. Every time you solve something with more UI, you add cognitive load. Less really is more!
Where did the menu go?
I lied before. There still is a menu. If you’ve used the Apple Vision Pro, you know that if you gaze up towards your eyebrows, a small arrow appears. Pinching while looking at it expands a menu reminiscent of the iPhone’s Control Center.
So why do I think this differs from the menu on Meta’s Quest Horizon platform?
My observation is that many VR experiences naturally involve your hands, creating a ‘hot zone’ of activity around them. In contrast, the space between your eyebrows usually remains unutilised: a ‘natural dead zone.’ Minimal global UI here is both unobtrusive and intuitively accessible.