Design patterns for Immersive Tech

AR hand menu

User goals

  • Provide relevant menu options that do not obscure the user's view
  • Provide a menu that is easy to access at all times
  • Allow the user to control the position of where the menu appears

Interaction

This is the AR version of Displaying menu on controllers. Here the AR headset must be able to track a handheld controller or the user's hands, as the menu options are anchored to that object in order to achieve the level of control needed.
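The anchoring described above can be sketched as a simple per-frame calculation: the menu follows the tracked hand at a fixed offset. This is a minimal illustration, assuming the AR framework reports the hand pose as a world-space position each frame; the function and parameter names are hypothetical, not a real tracking API.

```python
def anchor_menu(hand_position, offset):
    """Place the menu at a fixed offset from the tracked hand.

    hand_position and offset are (x, y, z) tuples in world space.
    Called every frame so the menu follows the hand and can never
    be misplaced or lost.
    """
    return tuple(h + o for h, o in zip(hand_position, offset))

# Example: float the menu 5 cm above the tracked hand position
menu_pos = anchor_menu(hand_position=(1.0, 2.0, 3.0), offset=(0.0, 0.05, 0.0))
```

A real implementation would use the tracker's full pose (position plus orientation) so the menu also rotates with the hand, but the principle is the same.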

Today, only Leap Motion can track the hands to the required level of accuracy.

  1. Show a visual indication that there is a hidden menu associated with the handheld device or the hands.
  2. The user then performs an interaction to open the menu (a button press on the device, a gesture, a voice command or, in the example below, a direct interaction between the hand and the menu indicator).
  3. Once the menu is expanded, the user can use their fingers, or a pointing mechanism such as gaze or an AR ray-cast, to highlight and select options.
  4. Note that the Leap Motion hand-based interaction (shown in the example) is similar to the hand interaction with arm menus in Facebook Spaces on Oculus Rift.

Linked to: AR movable menu panel, Displaying menu on controllers

Good

  • The menu can be triggered at any time. It is anchored to the hand, so it cannot be misplaced or lost.
  • The menu position can be controlled by moving the hand, so it does not obstruct other items.

Bad

The technology is not yet publicly available. However, we can already start to understand the benefits and principles of designing such menus for AR: how they would be used, what augmented features they would provide, and how to ensure the digital UI is available when needed.

Examples

Leap Motion: Project North Star conceptual work
