User goal / Problem this is trying to solve
- Allow the user to easily select something from a hand menu (or a menu attached to a controller)
- Use both hands to make interaction with contextual menus efficient
- Simulate direct manipulation using hands and fingers
This is the ‘virtual hands’ version of the Ray Casting & Hand Menu pattern. It achieves the same goals but is used when the application uses a visualisation of the hands and selection with the finger, instead of selection with a ray-cast pointer.
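Finger selection of this kind is usually implemented as a simple contact test between the tracked fingertip and each menu button. A minimal sketch in TypeScript, assuming a sphere-distance hit test (all names here are illustrative; a real application would use its engine's collision or poke-interaction system):

```typescript
// Minimal sketch of finger-press selection for a hand menu.
// Vec3, MenuButton and pollFingerPress are hypothetical names,
// not any particular SDK's API.

type Vec3 = { x: number; y: number; z: number };

interface MenuButton {
  label: string;
  center: Vec3;      // button centre in world space
  radius: number;    // hit radius around the button face, in metres
  onPress: () => void;
}

function dist(a: Vec3, b: Vec3): number {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Called once per frame with the tracked index-fingertip position.
// Returns the button touched this frame, if any. A production version
// would also debounce so a single touch fires onPress only once.
function pollFingerPress(fingerTip: Vec3, buttons: MenuButton[]): MenuButton | null {
  for (const b of buttons) {
    if (dist(fingerTip, b.center) <= b.radius) {
      b.onPress();
      return b;
    }
  }
  return null;
}
```

Because the menu is attached to the other hand, both the fingertip and the button centres move every frame, so the test is re-run per frame in menu-local or world space.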
- The hand menu appears on the left (or non-dominant) hand, triggered by a user action
- The hand menu can be triggered by picking up an object with that hand. Whichever hand picks up the object has the hand menu attached to it. This is a quick and intuitive way to show a set of contextual options or settings for the virtual object that was picked up.
- It can also be triggered by a button on the controller, or by another mechanism such as interacting with a VR wearable on the hand or wrist
- The other hand is used to interact with the options displayed on the hand menu. Typically, when the finger is the selection device, the user makes contact between the fingertip and the buttons on the menu.
- The hand menu can be closed by pressing the contextual-menu toggle command on the controller, by tapping a virtual close button on the menu itself with the finger, or by dropping the object held in that hand.
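The open/close behaviour described above can be summarised as a small state machine: a grab opens a contextual menu on the grabbing hand, and a toggle press, close-button tap, or drop dismisses it. A sketch under those assumptions (the event and field names are hypothetical):

```typescript
// Sketch of the hand-menu open/close logic as a reducer over events.
// Event names ("grab", "toggleButton", ...) are assumptions for
// illustration, not any particular SDK's API.

type MenuEvent =
  | { kind: "grab"; hand: "left" | "right"; object: string }
  | { kind: "drop"; hand: "left" | "right" }
  | { kind: "toggleButton" }   // controller button or wrist wearable
  | { kind: "closeButton" };   // virtual close button tapped on the menu

interface HandMenuState {
  open: boolean;
  attachedHand: "left" | "right" | null; // which hand the menu follows
  contextObject: string | null;          // object whose options are shown
}

const CLOSED: HandMenuState = { open: false, attachedHand: null, contextObject: null };

function reduce(state: HandMenuState, ev: MenuEvent): HandMenuState {
  switch (ev.kind) {
    case "grab":
      // Whichever hand picks up the object gets the contextual menu.
      return { open: true, attachedHand: ev.hand, contextObject: ev.object };
    case "drop":
      // Dropping the held object dismisses its contextual menu;
      // a drop by the other hand is ignored.
      return ev.hand === state.attachedHand ? CLOSED : state;
    case "toggleButton":
      // Explicit toggle opens a non-contextual menu on the non-dominant
      // (here: left) hand, or closes an open one.
      return state.open
        ? CLOSED
        : { open: true, attachedHand: "left", contextObject: null };
    case "closeButton":
      return CLOSED;
  }
}
```

Keeping all the open/close triggers in one reducer makes it easy to guarantee that the menu can never be left attached to a hand that has dropped its object.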
- Intuitive, as all controls are mapped to the user’s two hands (and making use of both hands is a plus)
- The finger works better than a ray-cast pointer at such short distances
- Picking up and dropping an object are intuitive, contextual ways to trigger the hand menu, and a natural way to discover the options associated with a specific virtual object.
- Arm menus generally have a number of benefits
- Only works on systems that have 6DOF on the controllers
- Only works on systems that track the hands (currently limited to Oculus or Leap Motion)
Facebook Spaces (Oculus Rift)
Contextual arm menu appears when object is picked up. Only relevant commands are shown.
Contextual menu can be used to launch augmented tools to carry out operations on the held object
The arm menu is also a useful mechanism for displaying notifications (an audio and haptic cue is also recommended to draw the user’s attention)