Proceedings of the 5th Symposium on Spatial User Interaction 2017
DOI: 10.1145/3131277.3132180

Gaze + pinch interaction in virtual reality

Abstract: Modern VR/AR systems extend the natural hand-tracking UI with eye-based interaction. Controllers, hand gestures, eye movements, and voice offer many ways to click buttons in virtual reality environments. What about simply glancing at a UI object with your eyes, then pinching with your fingers to activate it? Apple drove the first wide adoption of this interaction style with its Vision Pro spatial computer; the HoloLens 2 and Magic Leap offered similar functionality as well. But Apple, renowned for stellar pr…
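
As a rough illustration of the division of labour the abstract describes (gaze selects the target, a pinch confirms it), the following Python sketch shows one way such a per-frame loop could look. It is not the paper's implementation; every type, function name, and detail here is a hypothetical stand-in.

```python
from dataclasses import dataclass

# Minimal illustrative sketch, not the paper's implementation: gaze picks the
# target, a pinch gesture confirms it. All types and names are hypothetical.

@dataclass
class Target:
    name: str
    position: tuple   # (x, y, z) centre in world space
    radius: float     # selectable extent, in metres


def gaze_hit(origin, direction, targets):
    """Return the nearest target whose bounding sphere the gaze ray intersects.

    `direction` is assumed to be a unit vector.
    """
    best, best_t = None, float("inf")
    for t in targets:
        # Vector from the ray origin to the target centre.
        oc = tuple(p - o for p, o in zip(t.position, origin))
        proj = sum(a * b for a, b in zip(oc, direction))  # distance along the ray
        if proj < 0:
            continue  # target is behind the viewer
        closest = tuple(o + proj * d for o, d in zip(origin, direction))
        dist2 = sum((c - p) ** 2 for c, p in zip(closest, t.position))
        if dist2 <= t.radius ** 2 and proj < best_t:
            best, best_t = t, proj
    return best


def update(gaze_origin, gaze_dir, pinch_down, targets, state):
    """Per-frame logic: gaze continuously highlights; a pinch onset activates."""
    looked_at = gaze_hit(gaze_origin, gaze_dir, targets)
    if pinch_down and not state.get("was_pinching", False) and looked_at:
        print(f"activate {looked_at.name}")  # selection fires on the pinch edge
    state["was_pinching"] = pinch_down
    state["highlight"] = looked_at           # visual feedback follows the eyes
```

The point of the split is that the eyes do the fast, low-effort pointing while the hand supplies only a coarse trigger, so the pinch can happen anywhere in the tracking volume.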

Citations: Cited by 199 publications (81 citation statements)
References: 42 publications

“…A range of works have compared eye and head pointing showing that eye gaze is faster and less strenuous, while head pointing is often preferred as more stable, controlled and accurate [5,10,18,23,44]. As in 2D contexts, eye pointing can be combined with fast manual confirmation by click or hand gesture [41,46], or with dwell time or other specific eye movement for hands-free selection [20,31,42]. In contrast to the 2D desktop setting, gaze in VR inherently involves eye-head coordination due to the wider FOV.…”
Section: Gaze Interaction in 3D Environments (mentioning)
confidence: 99%
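The statement above contrasts fast manual confirmation (click or hand gesture) with dwell time for hands-free selection. As a hypothetical sketch of the dwell-time variant, the snippet below fires a target once the gaze has rested on it for a fixed duration; the class name and the 0.8 s threshold are assumptions chosen only for illustration.

```python
import time

DWELL_SECONDS = 0.8  # assumed threshold; real systems tune this per task


class DwellSelector:
    """Hands-free selection: a target activates after sustained fixation."""

    def __init__(self, dwell=DWELL_SECONDS):
        self.dwell = dwell
        self.current = None     # target currently being fixated
        self.entered_at = None  # when the fixation on it began

    def update(self, looked_at):
        """Call once per frame with the object under the gaze ray (or None)."""
        now = time.monotonic()
        if looked_at is not self.current:
            # Gaze moved to a different object (or to empty space): restart timing.
            self.current, self.entered_at = looked_at, now
            return None
        if looked_at is not None and now - self.entered_at >= self.dwell:
            self.entered_at = now  # re-arm so the target is not re-fired every frame
            return looked_at       # dwell elapsed: report a selection
        return None
```

The design trade-off is visible in the structure: dwell frees the hands but adds a fixed delay to every selection, whereas a manual confirmation fires as soon as the gaze is on target.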
“…Virtual reality devices available on the market, such as the Oculus Rift® or the HTC Vive®, require the use of controllers. However, these impose certain usability constraints [21]: 1) they dictate how the user must hold them; 2) they require a learning phase for a list of more or less complex commands; 3) they carry over older computing paradigms designed for two-dimensional interfaces (the mouse pointer). The arrival on the market of hand-tracking devices such as the Kinect®, the Intel RealSense®, or the Leap Motion® has opened up new perspectives for human-computer interaction in VR [10].…”
Section: Context (unclassified)
“…Gaze raycast: Tanriverdi and Jacob [40] proposed one of the early work on using gaze interaction in VR applications. They outlined two main reasons of why the use of gaze as an interaction means seems compelling in VR environment: 1) the users' preexisting abilities to perform interaction with a VR interface in the absence of conventional WIMP-based commands (such as typing keywords or clicking by mouse), 2) eye gaze allows for targeting distant objects faster and with less physical effort than pointing with hand [34,38,42]. They confirmed the latter by conducting an experiment comparing the performance of manual and gaze input for interacting with close and distant objects in VR.…”
Section: Gaze Interaction in VR (mentioning)
confidence: 99%
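One hedged way to make the quoted distance argument concrete: with a gaze raycast, pointing difficulty is governed by the target's visual angle rather than its physical distance, so a far object can be exactly as easy to fixate as a near one, and no reaching is involved. The numbers below are invented purely for illustration.

```python
import math


def visual_angle_deg(radius_m, distance_m):
    """Angular size of a sphere of the given radius seen from the given distance."""
    return math.degrees(2 * math.atan(radius_m / distance_m))


near_button = visual_angle_deg(0.02, 0.5)   # a 2 cm button at arm's length
far_panel = visual_angle_deg(0.40, 10.0)    # a 40 cm panel ten metres away

print(f"near button: {near_button:.1f} deg, far panel: {far_panel:.1f} deg")
# Both subtend the same visual angle (about 4.6 degrees), so gaze pointing treats
# them alike, whereas direct hand pointing cannot reach the far panel at all.
```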