Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications (ETRA 2018)
DOI: 10.1145/3204493.3204536
Error-aware gaze-based interfaces for robust mobile gaze interaction

Cited by 29 publications (23 citation statements)
References 40 publications
“…The same group recently reported an average calibration accuracy of 3.9°-4.34° for this eye tracker (FOVE; [47]). The range of these values highlights the necessity of good eye-tracking fidelity to achieve robust target selection, and newer approaches are already circumventing this limitation by adapting user interfaces (UIs) to the available gaze tracking performance, for example by dynamically changing the size of UI elements [2,14].…”
Section: Related Work and Background: Eye Gaze for Selection
Mentioning; confidence: 99%
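The adaptive-UI strategy quoted above can be made concrete: if the tracker's angular error is known, a selectable target must subtend at least that angle on screen. A minimal Python sketch, assuming a hypothetical setup (60 cm viewing distance, a roughly 96 DPI display, and an extra safety margin; none of these values come from the cited papers):

```python
import math

def min_target_size_px(error_deg,
                       distance_mm=600.0,   # assumed viewing distance
                       px_per_mm=3.78,      # ~96 DPI display, assumed
                       margin_deg=0.5):     # assumed safety margin
    """Smallest on-screen target size (pixels) that a gaze estimate
    with the given angular error can still hit reliably.

    A target subtending angle theta at distance d spans 2*d*tan(theta/2).
    """
    theta = math.radians(error_deg + margin_deg)
    width_mm = 2.0 * distance_mm * math.tan(theta / 2.0)
    return math.ceil(width_mm * px_per_mm)

# For the 3.9-4.34 degree accuracy range quoted above, UI elements would
# need to be roughly 175-195 px wide under these assumptions.
print(min_target_size_px(3.9), min_target_size_px(4.34))
```

An error-aware interface could call such a function per session, after calibration, to resize its buttons to the accuracy actually achieved rather than to a fixed worst case.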
“…Gaze-tracking equipment has become robust, inexpensive and accurate, and gaze sensors may soon be included in computers, phones, tablets, and head-mounted displays (Barz et al., 2018; Hu & Lodewijks, 2020). The use of eye-based physiological signals derived from blinks (Martins & Carvalho, 2015; Stern et al., 1984), eye movements (Cazzoli et al., 2014; Hirvonen et al., 2010), and pupils (Marandi et al., 2019; Morad et al., 2000), recorded using eye tracking (Hopstaken et al., 2015b; Itoh et al., 2000; Li, Chen et al., 2019; Maffei & Angrilli, 2018; Marandi et al., 2018) and electro-oculography (EOG; Hirvonen et al., 2010; Tag et al., 2019), has received increased scrutiny in recent years as a means of measuring fatigue (Dawson et al., 2014; Eckstein et al., 2017; Kramer, 1991; Martins & Carvalho, 2015).…”
Section: Introduction
Mentioning; confidence: 99%
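Of the blink-derived fatigue measures this excerpt enumerates, blink rate is the simplest to compute: threshold a normalized eye-openness signal and count closure onsets. The sketch below is an illustration, not taken from any of the cited papers; the threshold and the form of the input signal are assumptions:

```python
import numpy as np

def blink_rate_per_min(openness, fs, closed_thresh=0.2):
    """Blinks per minute from a normalized eye-openness signal.

    openness: 1-D array, one sample per frame (1.0 = open, 0.0 = closed).
    fs: sampling rate in Hz.
    A blink is counted at each open-to-closed transition.
    """
    closed = np.asarray(openness) < closed_thresh
    onsets = np.count_nonzero(~closed[:-1] & closed[1:])
    duration_min = len(closed) / fs / 60.0
    return onsets / duration_min
```

Fatigue studies typically go further, e.g. measuring blink duration or the percentage of time the eyes are closed; counting onsets is only the starting point.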
“…However, fixation-wise annotation does not remedy the need to annotate AOIs in every recording of every participant. A solution can be found in attaching fiducial markers to, e.g., a target stimulus in 2D [3] and 3D [4], an interactive area [5], or tangible objects [6]. In this research, we aim at circumventing the requirement to instrument the environment with obtrusive markers.…”
Section: Introduction
Mentioning; confidence: 99%
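For the marker-based AOI annotation this excerpt describes, a common implementation detects fiducials in each scene-camera frame and tests whether the gaze point falls inside a marker-anchored region. A minimal sketch using OpenCV's ArUco module (OpenCV 4.7+ API); treating the marker quad itself as the AOI is a simplification, since real setups usually define the AOI relative to the marker via a homography:

```python
import cv2
import numpy as np

# 4x4 dictionary with 50 ids; the dictionary choice is an assumption.
DICTIONARY = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
DETECTOR = cv2.aruco.ArucoDetector(DICTIONARY, cv2.aruco.DetectorParameters())

def gaze_to_aoi(frame, gaze_xy):
    """Return the id of the marker whose quad contains the gaze point,
    or None. frame is a BGR scene-camera image; gaze_xy is a gaze sample
    in the same pixel coordinates as the frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _rejected = DETECTOR.detectMarkers(gray)
    if ids is None:
        return None
    pt = (float(gaze_xy[0]), float(gaze_xy[1]))
    for quad, marker_id in zip(corners, ids.flatten()):
        poly = quad.reshape(-1, 2).astype(np.float32)
        # >= 0 means the point is inside or on the boundary of the quad.
        if cv2.pointPolygonTest(poly, pt, False) >= 0:
            return int(marker_id)
    return None
```

Per-frame AOI hits can then be aggregated over fixations to yield the fixation-wise annotations the excerpt mentions, without hand-labeling every recording.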