2022
DOI: 10.1016/j.patcog.2022.108944
In the eye of the beholder: A survey of gaze tracking techniques


Cited by 31 publications (15 citation statements)
References 76 publications
“…Since the two previously mentioned approaches rely on sensors such as infrared devices or electrodes, they can be grouped under the term sensor-based eye-tracking technologies. The other eye-tracking methods rely on computer-vision techniques to detect and track the human eyes in video frames captured by a camera in real time, without direct contact with the eyes or extra hardware sensors 31 33 .
Figure 1 Eye-tracking systems for patients with speech impairments.
…”
Section: Related Work
confidence: 99%
“…In order to track the gaze of the user, in addition to the location of the PC, the GC location is required to establish a mapping between the PC-GC vector and the screen [11]. In this work we detect the GC location with the same algorithm as for PC detection.…”
Section: GC Detection
confidence: 99%
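The excerpt above describes the classic pupil-center / glint-center (PC-GC) approach: the vector between the two eye features is mapped to a screen coordinate through a calibration procedure. A minimal sketch of one common realization of that mapping, a second-order polynomial fitted by least squares over a calibration grid, is shown below. This is an illustrative assumption, not necessarily the cited work's exact mapping, and all function names are made up for this sketch.

```python
import numpy as np

def poly_features(vectors):
    """Second-order polynomial features of the PC-GC vectors (vx, vy)."""
    vx, vy = vectors[:, 0], vectors[:, 1]
    return np.column_stack([np.ones_like(vx), vx, vy, vx * vy, vx**2, vy**2])

def calibrate(vectors, screen_points):
    """Fit mapping coefficients from calibration PC-GC vectors to known
    screen points with ordinary least squares (one coefficient column
    per screen axis)."""
    A = poly_features(vectors)
    coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return coeffs

def gaze_to_screen(vectors, coeffs):
    """Map new PC-GC vectors to estimated screen coordinates."""
    return poly_features(vectors) @ coeffs
```

In practice the calibration vectors come from asking the user to fixate a small grid of on-screen targets (nine points is a common choice, giving enough equations for the six coefficients per axis).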
“…This includes behavioral data such as intonation of speech (Linneman, 2013), the contents of conversations (Turmunkh et al., 2019), and, as we investigate here, response times. In addition, it is also possible to extract physiological data from video including facial expression (Canedo & Neves, 2019; Kumari et al., 2015), gaze location (Liu et al., 2022), and breathing (Janssen et al., 2015; Tran et al., 2017). Even heart rate can, in some cases, be extracted from video by measuring changes in the redness of the skin (Wu et al., 2012).…”
Section: Introduction
confidence: 99%
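The heart-rate idea mentioned in the last excerpt (remote photoplethysmography) can be sketched in a few lines: average the red channel of the face region per frame, then find the dominant frequency in the plausible pulse band. This is a deliberately simplified illustration, assuming a clean per-frame mean-redness signal; the cited work (Wu et al., 2012) uses a more elaborate video-magnification pipeline, and the function name here is illustrative.

```python
import numpy as np

def estimate_heart_rate(red_means, fps):
    """Estimate heart rate in bpm from a per-frame mean red-channel signal
    by locating the dominant frequency in the plausible pulse band."""
    signal = red_means - np.mean(red_means)            # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)             # roughly 42-240 bpm
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60.0
```

Restricting the search to 0.7–4.0 Hz discards slow drifts (lighting, motion) and high-frequency noise that would otherwise dominate the spectrum.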