Proceedings of the 2nd Augmented Human International Conference 2011
DOI: 10.1145/1959826.1959832
Acquisition of 3D gaze information from eyeball movements using inside-out camera

Cited by 8 publications (3 citation statements). References 5 publications.
“…First, we need to estimate the 3-D gaze point by using the inside-out camera system proposed in [8]. The process of estimating the 3-D gaze point in the camera coordinate system has the following flow.…”
Section: 3-D Gaze Point Estimation (mentioning)
confidence: 99%
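The quoted flow is truncated in the statement above. Purely as an illustration (not the actual procedure of [8], which is not reproduced here), a 3-D gaze point in camera coordinates can be recovered by triangulating the two gaze rays, i.e., taking the midpoint of the shortest segment between them; the function name, ray parameterization, and example numbers below are assumptions.

```python
import numpy as np

def triangulate_gaze(o_left, d_left, o_right, d_right):
    """Estimate a 3-D point of regard as the midpoint of the shortest
    segment between the left and right gaze rays (o + s * d)."""
    o1, o2 = np.asarray(o_left, float), np.asarray(o_right, float)
    d1 = np.asarray(d_left, float) / np.linalg.norm(d_left)
    d2 = np.asarray(d_right, float) / np.linalg.norm(d_right)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # (nearly) parallel rays: no vergence
        s, t = 0.0, e / c
    else:
        s = (b * e - c * d) / denom
        t = (a * e - b * d) / denom
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))

# Hypothetical example: eyes 6 cm apart, both rays converging 1 m ahead.
target = np.array([0.0, 0.0, 1.0])
left, right = np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 0.0])
print(triangulate_gaze(left, target - left, right, target - right))  # ~[0, 0, 1]
```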
“…When these head-mounted eye-tracking systems are used, the point-of-regard (POR) is generally measured as a point on the image plane. To acquire a 3-D POR, an inside-out camera system was proposed by Shimizu et al. [8]. The inside-out camera system can recover a 3-D POR only in a static state.…”
Section: Introduction (mentioning)
confidence: 99%
“…(2) In the mapping approach, a complementary device, such as the Kinect (Microsoft, USA) or another stereovision system with a SLAM algorithm [Paletta et al. 2013], is used to map the environment. The eye-tracking data is used only to select the target inside the field of view of that device [Frisoli et al. 2012; Atienza and Zelinsky 2003; Shimizu and Fujiyoshi 2011]. Thanks to the accuracy of stereovision systems [Peng 2011; Andersen et al. 2012; Microsoft 2016], this method is more suitable for large workspaces.…”
Section: Introduction (mentioning)
confidence: 99%
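As a minimal sketch of the mapping approach described in that statement (assuming the point of regard has already been registered to the complementary device's image plane, and using a generic pinhole back-projection; the cited systems handle the mapping and SLAM steps in their own ways), the 2-D gaze point can be lifted to a 3-D target by indexing the device's depth map. The intrinsics and image size below are placeholder values.

```python
import numpy as np

def gaze_to_3d(u, v, depth_map, fx, fy, cx, cy):
    """Back-project a 2-D point of regard (u, v) into the depth camera's
    coordinate frame, given a per-pixel depth map in metres and pinhole
    intrinsics (fx, fy, cx, cy)."""
    z = float(depth_map[int(round(v)), int(round(u))])
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])    # 3-D target in the depth camera frame

# Hypothetical example: Kinect-like intrinsics and a flat scene 2 m away.
depth = np.full((480, 640), 2.0)
print(gaze_to_3d(400.0, 300.0, depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5))
```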