ACM Symposium on Eye Tracking Research and Applications 2020
DOI: 10.1145/3379155.3391312

BimodalGaze: Seamlessly Refined Pointing with Gaze and Filtered Gestural Head Movement

Abstract: Figure 1: BimodalGaze enables users to point by gaze and to seamlessly refine the cursor position with head movement. A: In Gaze Mode, the cursor (yellow) follows where the user looks but may not be sufficiently accurate. B: The pointer automatically switches into Head Mode (green) when gestural head movement is detected. C: The pointer automatically switches back into Gaze Mode when the user redirects their attention. Note that the Head Mode is only invoked when needed for adjustment of the cursor. Any natura…
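The mode switching described in the caption can be read as a small per-frame state machine: gestural head movement hands cursor control to the head, and a shift of visual attention hands it back to gaze. The sketch below is a hypothetical illustration of that logic; the function name, signals, and threshold values are assumptions for illustration, not details from the paper.

```python
# Hypothetical sketch of the BimodalGaze mode-switching logic from the
# figure caption. Signal names and threshold values are illustrative
# assumptions, not values taken from the paper.

GAZE_MODE, HEAD_MODE = "gaze", "head"

HEAD_GESTURE_THRESHOLD = 5.0   # deg/s of head velocity treated as deliberate
REATTEND_THRESHOLD = 10.0      # deg of gaze-cursor divergence signalling refocus

def next_mode(mode, head_velocity, gaze_cursor_offset):
    """Return the pointer mode for the next frame.

    mode               -- current mode ("gaze" or "head")
    head_velocity      -- angular head speed in deg/s
    gaze_cursor_offset -- angular distance between gaze point and cursor, in deg
    """
    if mode == GAZE_MODE and head_velocity > HEAD_GESTURE_THRESHOLD:
        # Gestural head movement detected: refine the cursor with the head.
        return HEAD_MODE
    if mode == HEAD_MODE and gaze_cursor_offset > REATTEND_THRESHOLD:
        # The user redirected their attention: return control to gaze.
        return GAZE_MODE
    return mode  # otherwise, stay in the current mode
```

In this reading, Head Mode is only ever entered from Gaze Mode on a detected gesture, which matches the caption's note that it is invoked only when adjustment is needed; natural head movement below the velocity threshold leaves the pointer in Gaze Mode.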

Cited by 34 publications (46 citation statements)
References 32 publications
“…Most of the previous studies only supported simple tasks such as pointing and selection by combining eye gazing and head gestures [23], [24], [34]- [36]. Sidenmark and Gellersen [23] proposed leveraging the synergetic movement of eye and head and identified design principles for Eye&Head gaze interaction.…”
Section: A. Eye Gaze and Head Movement-Based Interactions in Pointing and Selection
confidence: 99%
“…Recently, several studies were conducted to utilize eye gazing and head gestures for hands-free interactions because eye-tracking sensors and gyro-sensors are embedded in smart devices. Sidenmark and Gellersen [23], Sidenmark et al [24] leveraged the synergetic movement of eye and head with naturally combined eye-head movement and refined the cursor position with gestural head movement. However, they performed only simple tasks such as selection, and they did not perform more complicated tasks such as 3D manipulation with 3D virtual objects.…”
Section: Introduction
confidence: 99%
“…Both rigid shapes (e.g., rectangular [107]) or circular [108] and flexible shapes (e.g., [109]) have been used, as well as various display media (e.g., projection on cardboard [17,107]), transparent props [12,98], handheld touchscreens [40,59], or virtual lenses [64,82]. In addition, the combination of eye-gaze with other modalities such as touch [85,86], mid-air gestures [87,97,101] and head-movements [56,103,104] has been recently investigated for interaction in spatial user interfaces. For a recent survey on gaze-based interaction in AR and VR, see Hirzle et al [47].…”
Section: Spatial Interaction
confidence: 99%
“…Pursuits are also suitable for hands-free object selections in virtual environments [7], or even selecting occluded objects [21]. Recent literature also gives insights on utilizing eye gaze compensated with head movements for gaze-based selections for improving targeting precision [10], enhancing usability [11,23], and freeing eye gaze exploration [22].…”
Section: Eye Tracking and Control
confidence: 99%