2023
DOI: 10.1101/2023.03.14.532608
Preprint

Objects guide human gaze behavior in dynamic real-world scenes

Abstract: The complexity of natural scenes makes it challenging to experimentally study the mechanisms behind human gaze behavior when viewing dynamic environments. Historically, eye movements were believed to be driven primarily by bottom-up saliency, but increasing evidence suggests that objects also play a significant role in guiding attention. We present a new computational framework to investigate the importance of objects for attentional guidance. This framework is designed to simulate realistic scanpaths for dyna…
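The preprint's actual implementation is not reproduced on this page. As a rough, hypothetical illustration of the core idea, that fixation targets are selected at the level of objects rather than from raw saliency alone, the following Python sketch scores candidate objects by saliency weighted with an object-level prior and applies inhibition of return to recently fixated objects. All names and parameters here are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Hypothetical toy sketch of object-based fixation selection. This is NOT
# the preprint's implementation: candidate objects are scored by bottom-up
# saliency times an object-level prior, a target is sampled, and the chosen
# object is suppressed (inhibition of return) before the next fixation.
def simulate_scanpath(saliency, object_prior, n_fixations=5, ior=0.2, seed=0):
    rng = np.random.default_rng(seed)
    weights = np.asarray(saliency, float) * np.asarray(object_prior, float)
    scanpath = []
    for _ in range(n_fixations):
        p = weights / weights.sum()
        target = int(rng.choice(len(weights), p=p))
        scanpath.append(target)
        weights[target] *= ior  # suppress the just-fixated object
    return scanpath

# Three objects: object 1 is salient but low-priority, object 2 the reverse.
print(simulate_scanpath(saliency=[0.2, 0.9, 0.5], object_prior=[1.0, 0.3, 1.5]))
```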


Cited by 1 publication (1 citation statement). References 160 publications (231 reference statements).
“…As to the analyses, a number of works have recently considered eye movements modelling from the foraging perspective (e.g., [23,8,7,22,20,9]) or the closely related one that exploits sequential decision-making based on drift-diffusion models (e.g. [10,31]). With respect to the analyses needed here such models are overly complex, given their aim of actually simulating gaze shifts, and/or just suitable to cope with simple visual stimuli.…”
Section: Methods
confidence: 99%
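The drift-diffusion models cited in the statement above ([10,31]) treat a gaze shift as a noisy evidence-accumulation decision: evidence drifts toward a bound, and crossing it triggers the shift. As a minimal sketch of that idea (parameters and names are illustrative assumptions, not taken from the cited works):

```python
import numpy as np

# Minimal drift-diffusion sketch: evidence for shifting gaze to a candidate
# target accumulates with drift v plus Gaussian noise until it hits +a
# ("shift") or -a ("stay"). Parameters are illustrative assumptions, not
# values from the works cited above.
def simulate_ddm(v=1.2, a=1.0, dt=0.001, noise=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x, t = 0.0, 0.0
    while abs(x) < a:
        x += v * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
    return ("shift" if x > 0 else "stay", t)

# Five simulated decisions with their latencies in seconds.
print([simulate_ddm(seed=s) for s in range(5)])
```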