2014 22nd International Conference on Pattern Recognition
DOI: 10.1109/icpr.2014.327
Appearance-Based Gaze Tracking with Free Head Movement

Cited by 14 publications (12 citation statements); references 11 publications.
“…Lai et al. [8] use random forests to learn the neighborhood structure for their joint head pose and eye appearance feature (HPEA). Gaze is estimated with linear interpolation using the neighbors in the random forest, yielding an accuracy of around 4.8° (horizontal and vertical combined).…”
Section: Methods for Single-Camera Remote Gaze Tracking
Citation type: mentioning (confidence: 99%)
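The statement above describes a two-step scheme: select neighbors in a joint head-pose/eye-appearance feature space, then interpolate their gaze labels. A minimal sketch of the interpolation step, substituting plain inverse-distance-weighted k-NN for the forest-learned neighborhood structure (the function name and the synthetic data are illustrative, not from the paper):

```python
import numpy as np

def interpolate_gaze(features, gazes, query, k=3):
    """Estimate gaze as an inverse-distance-weighted average of the
    k nearest training samples in the joint feature space."""
    dists = np.linalg.norm(features - query, axis=1)
    idx = np.argsort(dists)[:k]
    w = 1.0 / (dists[idx] + 1e-8)   # avoid division by zero
    w /= w.sum()
    return w @ gazes[idx]

# Synthetic data: gaze (yaw, pitch) is a linear function of a 6-D
# head-pose + eye-appearance feature vector.
rng = np.random.default_rng(0)
feats = rng.normal(size=(50, 6))
gazes = feats @ rng.normal(size=(6, 2))
est = interpolate_gaze(feats, gazes, feats[10])
```

In the actual method the neighborhood comes from the trained random forest rather than raw Euclidean distance, which is what lets head pose and eye appearance be weighted jointly.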
“…Even in a more recent survey [6] where both infrared (IR) and visible light methods are considered, the latter group is treated as just an alternative, and its subcategories are left unclear. Other categorization schemes also build on this ambiguity: appearance-based versus feature-based [7, 8] and appearance-based versus model-based [9, 10]. It should also be noted that the “appearance-based” name is still being used to refer to all visible light methods [11, 12], adding to the confusion.…”
Section: Categorization and Structure of Visible Light Gaze Tracking
Citation type: mentioning (confidence: 99%)
“…The resulting dense data set is subsequently used to cross-train a random-forest-based algorithm to estimate the gaze from newly observed eye images captured under head movement. Lai et al. [149] employ a random-forest-based technique as well, trained on image data collected by requesting the user to hold a fixed gaze and perform different head rotations, in order to select neighbours on a low-dimensional manifold given the observed head pose and eye appearance features. The selected neighbouring features under similar eye and head rotation angles are then used as input to an ALR technique, as previously described in Section 5.2.1, in order to estimate the gaze direction.…”
Section: Compensation for Head Rotation
Citation type: mentioning (confidence: 99%)
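The final step quoted above feeds the selected neighbors into an ALR (adaptive linear regression) estimator. A minimal sketch of that idea: reconstruct the query feature as a weighted combination of the neighbors' features, then apply the same weights to their gaze labels. This uses ordinary least squares, where ALR proper uses a sparse (l1-regularized) reconstruction; all names and data are illustrative:

```python
import numpy as np

def alr_style_estimate(neigh_feats, neigh_gazes, query):
    """Solve for reconstruction weights w with min ||neigh_feats.T @ w - query||,
    then transfer w to the neighbors' gaze labels."""
    w, *_ = np.linalg.lstsq(neigh_feats.T, query, rcond=None)
    return w @ neigh_gazes

rng = np.random.default_rng(1)
neigh_feats = rng.normal(size=(4, 6))      # 4 selected neighbor features
A = rng.normal(size=(6, 2))
neigh_gazes = neigh_feats @ A              # synthetic linear gaze model
c = np.array([0.5, 0.2, 0.2, 0.1])         # query lies in the neighbor span
query = c @ neigh_feats
est = alr_style_estimate(neigh_feats, neigh_gazes, query)
```

Because the neighbors are pre-selected under similar head and eye rotation angles, a local linear map like this can stand in for the globally nonlinear feature-to-gaze relationship.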
“…Moreover, as they usually require a controlled environment to prevent undesired reflections in the eyes, their applicability during daytime is precluded. Other common approaches employ 3D techniques [14] (using multiple cameras or depth sensors) and wearable devices such as helmets or glasses [15], which are cumbersome for the users. Universal gaze tracking from completely unobtrusive, remotely located low-cost sensors (e.g.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)