Orbits: Gaze Interaction for Smart Watches

Esteves, A., Velloso, E., Bulling, A., and Gellersen, H. 2015. Orbits: Gaze Interaction for Smart Watches using Smooth Pursuit Eye Movements. In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST ’15). ACM, New York, NY, USA, 457-466. [Best paper award]

Esteves, A., Velloso, E., Bulling, A., and Gellersen, H. 2015. Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets. In the Adjunct Proceedings of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2015 ACM International Symposium on Wearable Computers (UbiComp/ISWC’15 Adjunct). ACM, New York, NY, USA, 419-422.

Orbits is a novel technique that enables gaze-only input in a design that accounts for both the limited display space of smart watches and the spontaneous nature of glancing at a watch. Inspired by mechanical watches with several moving elements, such as the watch’s hands or the small dials and cogs in timers and chronometers, Orbits relies on interface controls that contain targets moving continuously along circular trajectories. Each target performs a distinct function and is activated by following it with the eyes for a certain amount of time. Orbits controls can be used for both discrete control (by treating each activation as a command) and continuous control (by using the time spent following the target to modify the value of the controlled parameter). Each Orbits widget comprises a trajectory, one or more targets, and feedback elements.
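Selection by smooth pursuit is typically implemented by correlating the stream of gaze samples with each target’s on-screen trajectory over a sliding window and picking the target whose motion the eyes match best. The sketch below is a minimal, hypothetical illustration of that idea, not the paper’s actual detector: the `select_target` helper, the window handling, and the 0.8 correlation threshold are all assumptions.

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences (0.0 if either is constant)."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    var_a = sum((x - mean_a) ** 2 for x in a)
    var_b = sum((y - mean_b) ** 2 for y in b)
    if var_a == 0 or var_b == 0:
        return 0.0
    return cov / math.sqrt(var_a * var_b)

def select_target(gaze, targets, threshold=0.8):
    """gaze: list of (x, y) samples over a sliding window; targets: dict mapping a
    target name to its (x, y) positions at the same timestamps. Returns the name
    of the target whose trajectory best correlates with the gaze, or None if no
    target exceeds the threshold on both axes."""
    gx = [p[0] for p in gaze]
    gy = [p[1] for p in gaze]
    best_name, best_score = None, threshold
    for name, traj in targets.items():
        tx = [p[0] for p in traj]
        ty = [p[1] for p in traj]
        score = min(pearson(gx, tx), pearson(gy, ty))  # both axes must match
        if score > best_score:
            best_name, best_score = name, score
    return best_name
```

Requiring the correlation to exceed the threshold on both axes matters because two targets on the same circle moving in opposite directions share one axis of motion and differ only on the other.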

With a speed of 120°/sec, up to eight moving targets can be reliably detected, with an average accuracy of 83% and zero false activations. Orbits also has a low false-positive rate, triggering a false selection in just 2.1% of trials when users simply read the time on the watch. Two studies demonstrated that Orbits does not depend on any particular tracking technology: both used affordable, consumer-grade eye trackers, spanning both remote and head-mounted form factors. Encouraged by these results, three example applications were developed to illustrate Orbits in practice. They were deployed on a Callisto 300 smart watch and tracked gaze input with a Pupil Pro head-mounted eye tracker.

The first example application was a music player. The interface consisted of five different Orbits controls that allowed users to perform several discrete actions: play/pause (one 1.6 cm target), skip to the previous or next song (two 1 cm targets with opposing directions), and adjust the playback volume (two 1 cm targets with opposing directions). All targets shared the same angular speed of 180°/sec. The goal of this interface was to provide fast, hands-free access to music, enabling interaction in scenarios that were previously challenging (e.g., biking to work) or cumbersome (e.g., while using both hands to type a document).
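Rendering widgets like these reduces to parameterising each target by its orbit radius, angular speed, and direction of rotation. A minimal sketch of such a position function follows; the `target_position` name, its parameters, and the phase handling are illustrative assumptions rather than anything specified in the paper.

```python
import math

def target_position(center, radius, speed_deg_s, t, clockwise=False, phase_deg=0.0):
    """Position of an orbiting target at time t (seconds), in the same units as
    center and radius. clockwise flips the direction of rotation, e.g. for the
    paired previous/next or volume targets that move in opposition."""
    sign = -1.0 if clockwise else 1.0
    angle = math.radians(phase_deg + sign * speed_deg_s * t)
    return (center[0] + radius * math.cos(angle),
            center[1] + radius * math.sin(angle))
```

At 180°/sec, each target completes a full revolution every two seconds, regardless of its orbit radius.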

The second example was a notification panel that presented six colored targets on the watch face. Each target and its color represented an application (e.g., Facebook, Snapchat), and the diameter of its trajectory represented the number of unaddressed notifications (the larger the diameter, the more notifications). The trajectories of these targets ranged from 0.6 to 2.6 cm, and all shared an angular speed of 180°/sec. The goal of the application was to highlight some of the unique qualities of Orbits interfaces: the selection area of each target was no more than 0.1 cm across (making it very challenging to acquire by touch), and a target would expand to present additional information, such as the notification count and the application’s logo, only when users followed it with their eyes, effectively managing the limited screen space by presenting information on a need-to-know basis.

The third and last example application provided a quick-access menu for a contextual event: a missed call. The interface consisted of a 2.6 cm main Orbits control that informed users of the event and, upon acquisition, displayed four further controls of 1 cm diameter. These four smaller Orbits allowed users to call back, reply to, store the number, or clear the event; all of their targets shared the same angular speed of 180°/sec, and the four controls would disappear after four seconds of inactivity. The goal of this interface was to enable users to inconspicuously address common communication events that can occur in sensitive or inappropriate situations (e.g., meetings).

The main motivation of this work was to enable hands-free interaction on smart watches, but it is easy to foresee combining Orbits with other modalities. For example, while touch or other hand-based techniques could remain the primary input modality due to their speed, Orbits could serve as a complementary modality when the hands are otherwise engaged (e.g., cooking). Additionally, Orbits can be used on devices other than smart watches, particularly where hand interaction is difficult (e.g., using a smartphone while on a treadmill) or impossible (e.g., assistive interfaces).

Screenshot from three demo applications for Orbits
