2021
DOI: 10.1101/2021.07.27.454022
Preprint
Action Affordance Affects Proximal and Distal Goal-Oriented Planning

Abstract: Seminal studies on human cognitive behavior have been conducted in controlled laboratory settings, demonstrating that visual attention is mainly goal-directed and allocated based on the action performed. However, it is unclear how far these results generalize to cognition in more naturalistic settings. The present study investigates active inference processes revealed by eye movements during interaction with familiar and novel tools with two levels of realism of the performed action. We presented participants …

Cited by 2 publications (7 citation statements) | References 47 publications
“…Using our eye movement classification algorithm, we showed that we could accurately classify eye movements of three-dimensional free-exploration data and that we can generate fERPs and fERSPs, proving that combining EEG and free-viewing virtual reality setups is possible. We investigated the classification quality using our modified version of a velocity-based classification algorithm (Dar et al., 2021; Keshava et al., 2023; Voloh et al., 2020), correcting for subject movement in the virtual environment. Furthermore, we compared two data-segmentation methods dealing with varying noise levels across a long recording.…”
Section: Discussion | Citation type: mentioning | Confidence: 99%
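For readers unfamiliar with velocity-based classification, the sketch below illustrates the general idea in Python: the angular velocity between consecutive gaze-direction samples is computed and thresholded to separate saccades from gaze samples. It is a minimal illustration under assumed inputs (unit gaze-direction vectors with timestamps) and an assumed 100 deg/s threshold; it is not the modified algorithm described in the cited work.

import numpy as np

def angular_velocity(gaze_dirs, timestamps):
    # Normalize gaze-direction vectors (one 3D vector per sample).
    v = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    # Angle between consecutive samples, in degrees.
    cos_theta = np.clip(np.sum(v[:-1] * v[1:], axis=1), -1.0, 1.0)
    angles = np.degrees(np.arccos(cos_theta))
    # Divide by the inter-sample interval (seconds) to get deg/s.
    return angles / np.diff(timestamps)

def classify_samples(gaze_dirs, timestamps, saccade_threshold=100.0):
    # Label each inter-sample interval: above threshold -> saccade, else gaze.
    vel = angular_velocity(gaze_dirs, timestamps)
    return np.where(vel > saccade_threshold, "saccade", "gaze")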
“…We, therefore, introduced a movement correction before calculating the eye angular velocity (Keshava et al., 2023) used to define gazes and saccades. The key to our approach is first to compute the hit point's movement in allocentric coordinates and then translate it to the required change of eye direction in allocentric coordinates.…”
Section: Eye-tracking Preprocessing and Classification | Citation type: mentioning | Confidence: 99%
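One way to realize such a correction, assuming both the eye position and the gaze hit points are logged in world (allocentric) coordinates, is to recompute the gaze direction from those allocentric quantities before taking the angular velocity, so that the subject's own translation does not register as eye rotation. The Python sketch below is illustrative; the function name and inputs are hypothetical, and it is not the exact procedure of Keshava et al. (2023).

import numpy as np

def corrected_angular_velocity(eye_pos, hit_points, timestamps):
    # Allocentric gaze vectors: from the eye's world position to the hit
    # point's world position, so subject movement does not inflate velocity.
    dirs = hit_points - eye_pos
    dirs = dirs / np.linalg.norm(dirs, axis=1, keepdims=True)
    # Angle between consecutive gaze directions, converted to deg/s.
    cos_theta = np.clip(np.sum(dirs[:-1] * dirs[1:], axis=1), -1.0, 1.0)
    angles = np.degrees(np.arccos(cos_theta))
    return angles / np.diff(timestamps)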
“…In recent years, virtual reality (VR) and mobile sensing have offered great opportunities to create controlled, natural environments. Here, subjects' eye and body movements can be measured reliably along with their interactions with the environment (Keshava et al., 2020, 2021; Clay et al., 2019; Mann et al., 2019). Experiments in virtual environments have grown popular in recent years and have shown promise towards studying cognition in naturalistic and controlled environments.…”
Section: Introduction | Citation type: mentioning | Confidence: 99%