With eye-tracking increasingly available in Augmented Reality, we explore how gaze can be used to assist freehand gestural text entry. Here, the eyes are often coordinated with manual input across the spatial positions of the keys. Inspired by this, we investigate gaze-assisted selection-based text entry through the concept of spatial alignment of both modalities. Users can enter text by aligning both gaze and manual pointer at each key, as a novel alternative to existing dwell-time or explicit manual triggers. We present a text entry user study comparing two such alignment techniques to a gaze-only and a manual-only baseline. The results show that one alignment technique reduces physical finger movement by more than half compared to standard in-air finger typing, and is faster and exhibits less perceived eye fatigue than an eyes-only dwell-time technique. We discuss trade-offs between unimodal and multimodal text entry techniques, pointing to novel ways to integrate eye movements to facilitate virtual text entry.
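As a minimal sketch of the alignment trigger (hypothetical Python, not the study's implementation; the key layout, coordinates, and function names are our assumptions): a key is entered the moment gaze and the manual pointer resolve to the same key, so neither a dwell timer nor an explicit click is required.

```python
# Minimal sketch of selection by gaze-hand alignment (hypothetical API,
# not the paper's implementation). A key fires the moment the gaze point
# and the manual pointer resolve to the same key, with no dwell or click.

from typing import Optional

def key_under(point_xy, keyboard) -> Optional[str]:
    """Return the key whose bounds contain the 2D point, else None."""
    for key, (x, y, w, h) in keyboard.items():
        if x <= point_xy[0] <= x + w and y <= point_xy[1] <= y + h:
            return key
    return None

def alignment_select(gaze_xy, pointer_xy, keyboard) -> Optional[str]:
    """Select a key only when both modalities agree on the same key."""
    gaze_key = key_under(gaze_xy, keyboard)
    hand_key = key_under(pointer_xy, keyboard)
    if gaze_key is not None and gaze_key == hand_key:
        return gaze_key  # alignment itself acts as the trigger
    return None
```

In practice one would likely add a short grace window so the two modalities do not have to coincide on the exact same frame.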
Figure 1: We present Vergence Matching, an interaction technique which uses the principle of motion correlation for selection of small targets in 3D environments. To select a target, smooth depth changes are induced perpendicular to the user's view plane: (a) when the target moves closer, the eyes move inwards, increasing the vergence angle (convergence); (b) vice versa, the vergence angle decreases (divergence) when the target moves away from the user. The relative vergence movements of the eyes are then correlated with the depth changes of the object to determine which target the user is attending to.
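A minimal sketch of the correlation step, under our own assumptions (signal names and the Pearson-over-a-sliding-window formulation are illustrative; the paper's actual pipeline may differ): the measured vergence trace is compared against each candidate target's depth trace, and the best match above a threshold is selected.

```python
# Motion correlation for vergence matching (illustrative sketch).
# Each candidate target oscillates smoothly in depth; the target whose
# depth trace best explains the measured vergence trace is selected.

import numpy as np
from typing import Dict, Optional

def pearson(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson correlation between two equally long 1D signals."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def match_vergence(vergence_deg: np.ndarray,
                   target_depths: Dict[str, np.ndarray],
                   threshold: float = 0.8) -> Optional[str]:
    """Pick the target whose depth motion best explains the vergence trace.

    vergence_deg: vergence angle over a sliding window (degrees).
    target_depths: per-target distance-to-user traces over the same window.
    Vergence increases as a target approaches, so we correlate against
    the negated depth and require the match to exceed a threshold.
    """
    best_id, best_r = None, threshold
    for tid, depth in target_depths.items():
        r = pearson(vergence_deg, -depth)
        if r > best_r:
            best_id, best_r = tid, r
    return best_id
```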
Gaze-Hand Alignment has recently been proposed for multimodal selection in 3D. The technique takes advantage of gaze for target pre-selection, as it naturally precedes manual input. Selection is then completed when manual input aligns with gaze on the target, without need for an additional click method. In this work we evaluate two alignment techniques, Gaze&Finger and Gaze&Handray, combining gaze with image plane pointing versus raycasting, in comparison with hands-only baselines and Gaze&Pinch as an established multimodal technique. We used a Fitts' Law study design with targets presented at different depths in the visual scene, to assess the effect of parallax on performance. The alignment techniques outperformed their respective hands-only baselines. Gaze&Finger is efficient when targets are close to the image plane but less performant with increasing target depth due to parallax.
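For reference, the quantities such a Fitts' Law study design typically reports can be computed as follows (the standard Shannon formulation, not this paper's specific analysis code):

```python
# Standard Shannon formulation of Fitts' index of difficulty (ID)
# and throughput, as commonly used in Fitts' Law selection studies.

import math

def index_of_difficulty(amplitude: float, width: float) -> float:
    """ID in bits: log2(A/W + 1), A = movement amplitude, W = target width."""
    return math.log2(amplitude / width + 1)

def throughput(amplitude: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits/s for one amplitude-width condition."""
    return index_of_difficulty(amplitude, width) / movement_time_s

# Example: A = 30 cm, W = 3 cm, mean selection time 0.8 s
# -> ID = log2(11) ~ 3.46 bits, throughput ~ 4.32 bits/s
```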
CCS CONCEPTS
• Human-centered computing → Mixed / augmented reality; Pointing; Interaction design theory, concepts and paradigms.
As eye tracking in augmented and virtual reality (AR/VR) becomes established, it will be used by broader demographics, increasing the likelihood of tracking errors. It is therefore important, when designing eye tracking applications or interaction techniques, to test them at different signal quality levels to ensure they function for as many people as possible. We present GE-Simulator, a novel open-source Unity toolkit that allows the simulation of accuracy, precision, and data loss errors during real-time usage by adding errors to the gaze vector reported by the head-mounted AR/VR eye tracker. The toolkit is customisable without changes to the source code, and error parameters can be varied during and in-between usage. It allows designers to prototype new applications at different levels of eye tracking signal quality in the early phases of design, and can be used to evaluate techniques with users at varying signal quality levels.
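A minimal sketch of the error-injection idea (standalone Python for illustration; GE-Simulator itself is a Unity/C# toolkit, and all names and simplifications here are ours): accuracy is modelled as a constant angular offset, precision as per-sample angular jitter, and data loss as randomly dropped samples.

```python
# Illustrative gaze error injection (not GE-Simulator's actual API).
# Accuracy = constant angular offset, precision = Gaussian angular jitter,
# data loss = probability of a dropped sample.

import random
from typing import Optional
import numpy as np

def perturb_gaze(gaze_dir: np.ndarray,
                 accuracy_deg: float = 1.0,
                 precision_deg: float = 0.5,
                 loss_prob: float = 0.05) -> Optional[np.ndarray]:
    """Return a perturbed unit gaze vector, or None to simulate data loss."""
    if random.random() < loss_prob:
        return None  # tracker dropped this sample
    # Total angular error: constant accuracy offset plus Gaussian jitter.
    # Simplification: the error is applied about one fixed perpendicular
    # axis, whereas a real error field varies over the visual field.
    angle = np.radians(accuracy_deg + random.gauss(0.0, precision_deg))
    helper = np.array([0.0, 1.0, 0.0])
    if abs(gaze_dir @ helper) > 0.99:        # gaze nearly vertical
        helper = np.array([1.0, 0.0, 0.0])
    axis = np.cross(gaze_dir, helper)
    axis /= np.linalg.norm(axis)
    # Rodrigues' rotation of gaze_dir by `angle` about `axis`
    rotated = (gaze_dir * np.cos(angle)
               + np.cross(axis, gaze_dir) * np.sin(angle)
               + axis * (axis @ gaze_dir) * (1 - np.cos(angle)))
    return rotated / np.linalg.norm(rotated)
```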
CCS CONCEPTS
• Human-centered computing → Human computer interaction (HCI); Mixed / augmented reality; Virtual reality; User interface toolkits.