Proceedings of the Symposium on Eye Tracking Research and Applications 2012
DOI: 10.1145/2168556.2168578
Eye-based head gestures

Cited by 58 publications (30 citation statements)
References 12 publications
“…In contrast to [Mardanbegi et al 2012] where authors used a head-mounted eye tracker, we concluded that no algorithms to filter out eye movements from EP CV are needed. Also, we found that horizontal and vertical head movements of small amplitude did not notably influence the measured point of gaze.…”
Section: Head Movements (contrasting)
confidence: 60%
“…To our knowledge, the only attempts to exploit this parameter are the recent studies [Mardanbegi et al 2012; Špakov and Majaranta 2012] where the eye position in the camera view was used to implement simple head gestures that could enhance gaze interaction. For example, a simple nod could be used to select the target pointed by gaze.…”
Section: Introduction (mentioning)
confidence: 99%
“…Mardanbegi et al demonstrated the use of head gestures in combination with gaze to interact with applications on a public display [15]. This work followed the same principles as in the previously described work on pointing.…”
Section: Multi-modal Gaze Interaction With Public Displays (mentioning)
confidence: 94%
“…Head-worn eyetracking has been used to map gaze to any planar digital display in a real-world environment [Mardanbegi and Hansen 2011]. Further work used nodding gestures combined with gaze to issue commands in remote applications [Mardanbegi et al 2012]. Interaction with eye-gaze on portable devices poses additional challenges.…”
Section: Gaze-supported Interaction (mentioning)
confidence: 99%