Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications - ETRA '10 (2010)
DOI: 10.1145/1743666.1743705
Estimating 3D point-of-regard and visualizing gaze trajectories under natural head movements

Cited by 17 publications (9 citation statements). References 10 publications.
“…Basically, there are two approaches. The first one, shown in Figure , is to visualize scanpaths in the 3D domain of the stimulus [DMC*02, SND10b, TKS*10, Pfe12, PSF*13a]. The other one is to warp the stimulus into a 2D representation and to draw scanpath lines on this 2D image [RTSB04].…”
Section: Point-based Visualization Techniques (mentioning)
confidence: 99%
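The quoted passage contrasts drawing scanpaths in the 3D domain of the stimulus with warping the stimulus to a 2D image and drawing scanpath lines on it. Below is a minimal illustrative sketch of the second approach, assuming fixations are available as ordered (x, y) pixel coordinates; it uses OpenCV and is not the method of any of the cited works.

```python
import cv2

def draw_scanpath(image, fixations, color=(0, 0, 255)):
    """Overlay a scanpath polyline on a 2D stimulus image.

    `fixations` is assumed to be an ordered list of (x, y) pixel
    coordinates, one per fixation (hypothetical input format).
    """
    canvas = image.copy()
    for i, (x, y) in enumerate(fixations):
        # mark the fixation location
        cv2.circle(canvas, (int(x), int(y)), 8, color, thickness=-1)
        # connect consecutive fixations to show the trajectory
        if i > 0:
            px, py = fixations[i - 1]
            cv2.line(canvas, (int(px), int(py)), (int(x), int(y)), color, 2)
    return canvas
```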
“…Figure 2 shows the process of estimating a 3D POR. The process generates a triangular mesh of the PTAM points that are visible on the current frame [5,31]. The 3D POR X is calculated from the triangle X₀X₁X₂ of visible PTAM points which includes the intersection point…”
Section: 3D Point-of-Regards (mentioning)
confidence: 99%
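The statement describes computing the 3D POR as the point where the gaze ray meets a triangle of visible PTAM map points. A minimal sketch of that geometric step is given below, using the standard Möller–Trumbore ray-triangle intersection; variable names are hypothetical and this is not the authors' implementation.

```python
import numpy as np

def gaze_triangle_intersection(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the 3D point where the gaze ray hits triangle (v0, v1, v2),
    or None if it misses. `origin` and `direction` describe the gaze ray
    in world coordinates (assumed inputs, not the paper's notation)."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    t_vec = origin - v0
    u = np.dot(t_vec, p) * inv_det
    if u < 0.0 or u > 1.0:          # outside the triangle
        return None
    q = np.cross(t_vec, e1)
    v = np.dot(direction, q) * inv_det
    if v < 0.0 or u + v > 1.0:      # outside the triangle
        return None
    t = np.dot(e2, q) * inv_det
    if t < 0.0:                     # intersection behind the viewer
        return None
    return origin + t * direction   # the estimated 3D POR
```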
“…This allows for triangulating its position in the world but this technique is clearly cumbersome when multiple points of interest are needed. This problem has been recently addressed by employing visual mapping (such as SLAM) techniques to estimate the user's movement and obtain directly 3D PORs [4,31].…”
Section: Introduction (mentioning)
confidence: 99%
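The "cumbersome" technique mentioned here is triangulating a point of interest from gaze rays cast at it from different head positions. As a rough illustration, one common formulation takes the midpoint of the shortest segment between two such rays; the sketch below assumes unit direction vectors and is not taken from the cited papers.

```python
import numpy as np

def triangulate_two_rays(o1, d1, o2, d2, eps=1e-9):
    """Midpoint of the shortest segment between two gaze rays, each given
    by an origin `o` and direction `d` (hypothetical names). Returns None
    if the rays are (nearly) parallel."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    b = np.dot(d1, d2)
    denom = 1.0 - b * b
    if denom < eps:                          # rays nearly parallel
        return None
    r = o2 - o1
    t1 = (np.dot(r, d1) - b * np.dot(r, d2)) / denom
    t2 = (b * np.dot(r, d1) - np.dot(r, d2)) / denom
    p1 = o1 + t1 * d1                        # closest point on ray 1
    p2 = o2 + t2 * d2                        # closest point on ray 2
    return 0.5 * (p1 + p2)
```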
“…The 3D PoG can be obtained easily from the 2D point by looking up the 3D coordinates of the pixel in the point cloud data structure provided by the RGB-D camera. Exploitation of the RGB-D point cloud structure removes the need for stereo eye tracking during 3D PoG estimation as used in [4,5].…”
Section: Point of Gaze Estimation (mentioning)
confidence: 99%
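Looking up the 3D coordinates of a gaze pixel in an organized RGB-D point cloud is equivalent to back-projecting the pixel with the depth value and the camera intrinsics. The sketch below shows that pinhole back-projection under assumed intrinsics (fx, fy, cx, cy); it is an illustration, not a specific library's API or the cited paper's code.

```python
import numpy as np

def gaze_pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a 2D gaze pixel (u, v) with depth `depth_m` (metres)
    to a 3D point in the camera frame, using pinhole intrinsics.
    Indexing an organized point cloud at (v, u) yields the same point."""
    if depth_m <= 0:                 # invalid or missing depth reading
        return None
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])
```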