2014
DOI: 10.1145/2593689

Online 3D Gaze Localization on Stereoscopic Displays

Abstract: This article summarizes our previous work on developing an online system to allow the estimation of 3D gaze depth using eye tracking in a stereoscopic environment. We report on recent extensions allowing us to report the full 3D gaze position. Our system employs a 3D calibration process that determines the parameters of a mapping from a naive depth estimate, based simply on triangulation, to a refined 3D gaze point estimate tuned to a particular user. We show that our system is an improvement on the geometry-b…
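The calibration idea in the abstract — correcting a naive triangulated depth to a per-user refined estimate — can be sketched as a simple regression. The polynomial form, target depths, and all sample values below are illustrative assumptions; the paper's actual mapping is not given in this excerpt.

```python
import numpy as np

# Hypothetical calibration data: naive triangulated depths recorded while
# the user fixates targets at known true depths (metres).
naive_z = np.array([0.55, 0.62, 0.70, 0.79, 0.88])
true_z = np.array([0.60, 0.65, 0.70, 0.75, 0.80])

# Fit a low-order polynomial correction tuned to this user.
coeffs = np.polyfit(naive_z, true_z, deg=2)
refine = np.poly1d(coeffs)

# Apply the correction to a new naive depth reading.
refined_depth = refine(0.70)
```

A least-squares polynomial is only one plausible choice here; the per-user tuning step matters because vergence-based depth estimates are known to carry systematic, user-specific bias.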

Cited by 26 publications (23 citation statements)
References 17 publications
“…The following pairwise comparisons with Bonferroni correction revealed significantly different errors between depths of 5 and 15, -5 and 15, and between 15 and -15 cm. Results generally corroborate previous findings of signed vergence error on stereoscopic displays [Wang et al 2013].…”
Section: Vergence Errors: Physical vs. Virtual Environment (supporting)
confidence: 90%
“…Gaze depth, i.e., vergence, can be measured via spatial triangulation [Pfeiffer 2010]. The geometry of the approach is given by Wang et al [2013]. If a binocular eye tracker delivers two on-screen gaze points, (x l , y l ) for the left eye and (xr, yr) for the right, then the horizontal disparity ∆x = xr −x l is sufficient to estimate gaze depth z [Duchowski et al 2011].…”
Section: Previous Work (mentioning)
confidence: 99%
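The triangulation described in the statement above can be sketched as follows. The eye placement, interpupillary distance, and viewer-to-screen distance are illustrative assumptions not given in the quoted text; only the role of the horizontal disparity ∆x = xr − xl comes from it.

```python
def gaze_point_3d(xl, yl, xr, yr, ipd=0.065, screen_dist=0.7):
    """Triangulate a naive 3D gaze point from binocular on-screen gaze samples.

    Assumes eyes at (-ipd/2, 0, 0) and (+ipd/2, 0, 0) looking down +z,
    with the screen in the plane z = screen_dist. All units are metres,
    screen coordinates centred between the eyes.
    """
    dx = xr - xl  # horizontal disparity
    if ipd - dx <= 0:
        raise ValueError("divergent gaze rays do not intersect in front of the viewer")
    t = ipd / (ipd - dx)  # ray parameter where the two gaze rays cross
    z = t * screen_dist   # gaze depth: z == screen_dist when dx == 0
    x = -ipd / 2 + t * (xl + ipd / 2)
    y = t * (yl + yr) / 2  # average the (ideally equal) vertical positions
    return x, y, z
```

With zero disparity the rays cross on the screen plane; uncrossed disparity (dx > 0) places the gaze point behind the screen, matching the depth behaviour the citing papers describe.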
“…Wang et al. had studied this problem and proposed a 3D calibration method [6]. This method was adopted with minor changes to fit our actual 3D calibration situation.…”
Section: Creating of 3D FDM with Gaze Depth (mentioning)
confidence: 99%
“…They proposed a method to measure gaze depth in [3]. After that, a method to improve the accuracy of eye-tracking data under stereo viewing was proposed, in which they followed the eye tracker's native 2D calibration with a 3D calibration [6]. Another approach, the Parametrized Self-Organizing Map (PSOM) proposed by Essig et al. [7], was designed to solve the 3D calibration problem by mapping two 2D screen coordinates to a 3D gaze point.…”
Section: Introduction (mentioning)
confidence: 99%