2021
DOI: 10.1109/lra.2021.3097274

4D Attention: Comprehensive Framework for Spatio-Temporal Gaze Mapping

Abstract: This study presents a framework for capturing human attention in the spatio-temporal domain using eye-tracking glasses. Attention mapping is a key technology for human perceptual activity analysis or Human-Robot Interaction (HRI) to support human visual cognition; however, measuring human attention in dynamic environments is challenging owing to the difficulty in localizing the subject and dealing with moving objects. To address this, we present a comprehensive framework, 4D Attention, for unified gaze mapping …
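
The truncated abstract already names the core operation: a gaze ray measured by the glasses must be transformed into a localized world frame and intersected with the scene. The sketch below is a minimal illustration of that step only, not the paper's implementation; the pose input, the plane-based scene model, and all function and variable names are assumptions made for this example.

import numpy as np

def map_gaze_to_world(T_world_glasses, gaze_dir_glasses, scene_planes):
    """Cast one gaze ray into the world frame and return the nearest hit.

    T_world_glasses  : 4x4 pose of the glasses from some localization module
                       (hypothetical input; the paper's own localization
                       pipeline is not reproduced here).
    gaze_dir_glasses : unit gaze direction in the glasses frame, as reported
                       by the eye tracker.
    scene_planes     : list of (point, normal) tuples approximating scene
                       surfaces (a stand-in for a full 3D scene model).
    """
    origin = T_world_glasses[:3, 3]                          # ray origin = glasses position
    direction = T_world_glasses[:3, :3] @ gaze_dir_glasses   # rotate gaze into world frame

    best_t, best_hit = np.inf, None
    for point, normal in scene_planes:
        denom = direction @ normal
        if abs(denom) < 1e-9:                                # ray parallel to this plane
            continue
        t = ((point - origin) @ normal) / denom              # ray-plane intersection distance
        if 1e-6 < t < best_t:                                # keep nearest hit in front of wearer
            best_t, best_hit = t, origin + t * direction
    return best_hit

# Example: wearer at the origin looking down +x at a wall 2 m ahead.
T = np.eye(4)
wall = (np.array([2.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0]))
hit = map_gaze_to_world(T, np.array([1.0, 0.0, 0.0]), [wall])  # -> [2, 0, 0]

For dynamic scenes, the same ray test would run against each tracked object's pose at the gaze timestamp; accumulating the timestamped hits over a recording yields a spatio-temporal attention record of the kind the abstract describes.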

Citation Types: 0 supporting, 2 mentioning, 0 contrasting

Cited by 7 publications (2 citation statements)
References: 33 publications

“…Techniques considering 3D eye tracking data were mainly developed for VR scenarios (e.g., Stellmach et al. [SND10a]) or were included in desktop applications [OKYB21]. Possible gaze visualizations include 3D scan paths as well as 3D attention maps [SND10b].…”
Section: Related Work (mentioning)
confidence: 99%
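
The "3D attention maps" this statement mentions can be illustrated by accumulating gaze intersection points into a volume. The snippet below is a generic voxel-binning sketch, not the method of [SND10b] or of the cited paper; the grid parameters and the function name are hypothetical.

import numpy as np

def attention_volume(hits, bounds_min, voxel_size, shape):
    """Bin 3D gaze hit points into a voxel grid (a simple 3D attention map).

    hits : (N, 3) array of gaze intersection points in world coordinates.
    """
    grid = np.zeros(shape, dtype=np.int32)
    idx = np.floor((hits - bounds_min) / voxel_size).astype(int)
    valid = np.all((idx >= 0) & (idx < np.array(shape)), axis=1)  # keep hits inside the grid
    np.add.at(grid, tuple(idx[valid].T), 1)                       # count dwell per voxel
    return grid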
“…The creation of three-dimensional (3D) building models is an important topic not only in the fields of urban planning, landscaping, disaster management, and urban activity monitoring but also for commercial applications such as movies and virtual reality [1][2][3][4]. In recent years, the development of 3D building models has accelerated worldwide with the emergence of digital twins [5,6].…”
Section: Introduction (mentioning)
confidence: 99%