Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications (ETRA), 2019
DOI: 10.1145/3317956.3318150
Pointing by gaze, head, and foot in a head-mounted display

Abstract:  Users may download and print one copy of any publication from the public portal for the purpose of private study or research.  You may not further distribute the material or use it for any profit-making activity or commercial gain  You may freely distribute the URL identifying the publication in the public portal If you believe that this document breaches copyright please contact us providing details, and we will remove access to the work immediately and investigate your claim.

Cited by 29 publications (16 citation statements) · References 35 publications
“…Sidenmark and Gellersen [24] have demonstrated that eye-head interaction in virtual reality applications leads to fast gaze pointing and selection. Although those studies [21][22][23][24][25][26] examined the effectiveness of an eye-gaze input system combined with key input or speech, they did not compare the system with one involving only an eye-gaze interface. That is, these studies did not aim at developing a system that could be executed with only an eye-gaze interface.…”
Section: Discussion
Confidence: 99%
“…Recent studies on eye-gaze interfaces [21][22][23][24][25][26] have demonstrated the effectiveness of such interfaces. Sidenmark and Gellersen [24] have demonstrated that eye-head interaction in virtual reality applications provides users with faster pointing and selection.…”
Section: Introduction
Confidence: 99%
“…However, it may be possible that the pointer would be easier to control using different techniques. Minakata et al. [23] compared head pointing with gaze position tracking and foot gestures and identified head pointing as the most efficient interaction method for pointing tasks. However, hand gestures, as implemented by Feng et al. [15], were not included in their experiment.…”
Section: Discussion
Confidence: 99%
“…Using RGB and RGB-D camera sensors, it is also possible to determine the users' body location in space and use that information to select menu items or objects that are closer to users [48] or in front of the users [33]. Two studies used the users' feet to control a platform that acted like a mouse depending on the direction and angle of the feet on the platform, allowing control of the direction of a virtual pointer [4,75]. Lastly, a single study was found where the users were able to perform the selection by eye-gazing at an element and confirming the selection by contracting the muscles of their arms [83], which was made possible using an electromyograph (EMG).…”
Section: Interaction Interfaces
Confidence: 99%