Modern VR/AR systems extend the natural hand-tracking UI with eye-based interaction

Controllers, hand gestures, eye movements, and voice: there are many ways to click a button in a virtual reality environment. But what about this: glance at a UI object with your eyes, then simply pinch your fingers to activate it? Apple is driving the first wide adoption of this interaction style with its Vision Pro spatial computer. The HoloLens 2 and Magic Leap offered similar functionality earlier, but Apple, renowned for stellar product design, may be the one to nail it. Early users are raving about technology that feels "mind-blowing" and "telepathic".

To shed light on the interaction design, we present 5 design principles and 5 design issues. These are based on human-computer interaction research, above all the paper "Gaze + Pinch Interaction in Virtual Reality", presented at the 2017 ACM Symposium on Spatial User Interaction. We'll see how much Apple has considered these scientific roots when we get our hands on it!

Design Principles

Division of labor: The eyes select, the hands manipulate

Our eyes' natural role is to indicate points of interest, and we can look at any point at will with little effort. The hands, in contrast, are adept at physical manipulation through the interplay of finger movement and hand posture. Use a clear separation of roles: let the gaze pick the target, and let the hand gesture confirm and manipulate it.
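To make this division of labor concrete, here is a minimal sketch of a gaze+pinch dispatcher in Swift. The types and callbacks (GazeSample, PinchEvent, Selectable, GazePinchInteractor) are hypothetical stand-ins rather than visionOS or ARKit API; the point is the split of responsibilities: the gaze ray resolves the target exactly once, at pinch onset, and the hand alone drives the manipulation afterwards.

```swift
import simd

// Hypothetical input events. Real SDKs (e.g., visionOS/ARKit) expose
// eye and hand tracking through different, more restricted APIs.
struct GazeSample {
    let origin: SIMD3<Float>     // eye position in world space
    let direction: SIMD3<Float>  // normalized gaze direction
}

enum PinchEvent {
    case began
    case moved(translation: SIMD3<Float>)
    case ended
}

protocol Selectable: AnyObject {
    func isHit(origin: SIMD3<Float>, direction: SIMD3<Float>) -> Bool
    func translate(by delta: SIMD3<Float>)
}

/// Division of labor: the eyes select, the hands manipulate.
final class GazePinchInteractor {
    private let scene: [Selectable]
    private var latestGaze: GazeSample?
    private var target: Selectable?   // locked in at pinch onset

    init(scene: [Selectable]) { self.scene = scene }

    // Fed continuously by the (hypothetical) eye tracker.
    func onGaze(_ sample: GazeSample) { latestGaze = sample }

    // Fed by the (hypothetical) hand tracker.
    func onPinch(_ event: PinchEvent) {
        switch event {
        case .began:
            // Selection happens once, at pinch onset: whatever the user
            // is looking at right now becomes the manipulation target.
            guard let gaze = latestGaze else { return }
            target = scene.first { $0.isHit(origin: gaze.origin, direction: gaze.direction) }
        case .moved(let translation):
            // Manipulation is driven purely by the hand; the eyes are
            // free to look elsewhere without disturbing the target.
            target?.translate(by: translation)
        case .ended:
            target = nil
        }
    }
}
```

Locking the target at pinch onset is deliberate: once the manipulation starts, the user can look away, at the drop location or at another object, without the selection jumping along with their eyes.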