Highlighting items with ‘Gaze control’

User goal or problem this is trying to solve

  • Indicating an item of interest when there is no handheld pointer device
  • Highlighting an item from the menu
  • Exploring to see which items in the 3D space are interactive, without committing to a selection

Interaction

The user points at items they want to select or interact with using the focal point of their gaze. This requires them to move their head around the virtual space and aim their gaze directly at the item of interest.

Often, an on-screen pointer or crosshair indicates to the user where the head-mounted display is pointing. This is important because the headset tracks only head position, not the eyes, so the user is frequently not looking at the same spot as the gaze pointer.

When the gaze pointer encounters a selectable or interactive item, either the pointer or the item gives visual feedback that an interaction is possible. For example, the item appears highlighted, zooms to a bigger size, or shows some other colour change or animation to draw the user’s attention.
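As a concrete illustration, this hover feedback can be driven by a raycast fired from the centre of the head-mounted camera's view each frame. The sketch below assumes a three.js scene in which interactive items are meshes with a MeshStandardMaterial; the emissive tint stands in for whatever highlight, zoom or animation the design actually calls for, and the function and variable names are illustrative only.

```typescript
import * as THREE from 'three';

// Cast a ray from the centre of the camera's view (the gaze pointer)
// and highlight whichever interactive object it currently hits.
const raycaster = new THREE.Raycaster();
const screenCentre = new THREE.Vector2(0, 0); // NDC centre = where the head points

let highlighted: THREE.Mesh | null = null;

function updateGazeHighlight(camera: THREE.Camera, interactives: THREE.Mesh[]): void {
  raycaster.setFromCamera(screenCentre, camera);
  const hits = raycaster.intersectObjects(interactives, false);
  const hit = hits.length > 0 ? (hits[0].object as THREE.Mesh) : null;

  if (hit === highlighted) return; // nothing changed this frame

  // Remove feedback from the previously highlighted object.
  if (highlighted) {
    (highlighted.material as THREE.MeshStandardMaterial).emissive.setHex(0x000000);
  }
  // Apply feedback (an emissive tint) to the newly gazed-at object.
  if (hit) {
    (hit.material as THREE.MeshStandardMaterial).emissive.setHex(0x333333);
  }
  highlighted = hit;
}
```

Calling updateGazeHighlight once per rendered frame keeps the highlight in sync with head movement without any handheld pointer.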

  1. The user moves the gaze pointer, fixed at the centre of their visual field, by moving their head
  2. When the gaze pointer points directly at an interactive object, the object shows a visual change or animation to indicate an interactive affordance (note: this interactive behaviour will be described in more detail in a separate future pattern)
  3. Often a physical input on a handheld controller needs to be activated to select the item or trigger the interaction, decoupling the ‘highlight’ and ‘select’ interactions (see the sketch after this list). Otherwise the user could accidentally activate many objects simply by looking around to explore what’s available, which would take away their feeling of control
  4. A variation uses a prolonged fixation (dwell) to trigger a selection. This is the eye-gaze equivalent of a long press, and will be covered in a separate interaction pattern.
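To keep ‘highlight’ and ‘select’ decoupled (step 3), and optionally to support the dwell variant (step 4), the selection logic can be a small per-frame tracker that fires only on an explicit button press or after a fixation timeout. This is a framework-agnostic sketch under those assumptions; GazeSelector, dwellMs and onSelect are illustrative names rather than any particular SDK's API.

```typescript
// Tracks the item under the gaze pointer and decides when a selection fires:
// either an explicit controller button press, or (optionally) a dwell timeout.
class GazeSelector<T> {
  private current: T | null = null;
  private gazedSinceMs = 0;

  constructor(
    private onSelect: (item: T) => void,
    private dwellMs: number | null = null, // null = button-press selection only
  ) {}

  // Call every frame with the item under the gaze pointer (or null) and
  // whether the controller's select button was pressed this frame.
  update(gazedItem: T | null, buttonPressed: boolean, nowMs: number): void {
    if (gazedItem !== this.current) {
      this.current = gazedItem;   // highlight moved to a new item
      this.gazedSinceMs = nowMs;  // restart the dwell timer
    }
    if (this.current === null) return;

    const dwelled =
      this.dwellMs !== null && nowMs - this.gazedSinceMs >= this.dwellMs;

    // Selection fires only on an explicit press (or a completed dwell),
    // never just because the user glanced at the item.
    if (buttonPressed || dwelled) {
      this.onSelect(this.current);
      this.gazedSinceMs = nowMs;  // avoid an immediate re-trigger after a dwell
    }
  }
}

// Usage: a button press selects; pass e.g. 1500 as dwellMs for dwell selection.
const selector = new GazeSelector<string>((id) => console.log('selected', id));
```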

Good

  • Allows the user to explore and highlight any object in the 3D space simply by moving the pointer with their head
  • Works across all head mounted display platforms, making it a simple, generic navigation method that is always available, and relatively easy for developers to implement.
  • Prevents the user from accidentally selecting objects just by looking at them, by decoupling highlighting from selection

Bad

  • It is often physically and cognitively fatiguing, as the user has to make frequent, sometimes erratic head movements to navigate
  • Often the pointer is too small or subtle and users lose sight of it; not realising their head is not pointed dead centre at the target, they become frustrated that they cannot highlight or select the desired object.
  • Our hands offer very precise movements because we have evolved extremely fine manual motor control, and the same is true of our eyes. We have far less dexterity with head movements, however, as we rarely need to adjust head position with the same precision. For this reason the interaction is often fiddly when fine movements are required.
  • UIs and menus need to be designed to minimise both unnecessary movement and user fatigue. For example, sequential actions on a menu should require only small incremental head movements, optimising for Fitts’s law, rather than jerky back-and-forth sweeps from one side of the visual field to the other (see the worked example below). For similar reasons, targets should be sufficiently large and the gaze pointer should be prominent and clear.
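One rough way to reason about this is Fitts’s index of difficulty, ID = log2(D/W + 1), with D the angular distance the head must rotate and W the angular width of the target. Applying the formula to head rotation as below is an approximation for illustration, not a claim from the pattern itself.

```typescript
// Fitts's index of difficulty (in bits): ID = log2(D / W + 1),
// here applied to angular head movement in degrees as an approximation.
function indexOfDifficulty(angularDistanceDeg: number, targetWidthDeg: number): number {
  return Math.log2(angularDistanceDeg / targetWidthDeg + 1);
}

// Two adjacent 10°-wide menu items 12° apart are far easier to acquire
// (ID ≈ 1.14 bits) than a 2°-wide target 60° across the visual field
// (ID ≈ 4.95 bits).
console.log(indexOfDifficulty(12, 10).toFixed(2)); // "1.14"
console.log(indexOfDifficulty(60, 2).toFixed(2));  // "4.95"
```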

Examples

Sky VR, Gear VR

Rush of Blood, PlayStation VR