2016
DOI: 10.1016/j.neuroimage.2016.02.026
Look into my eyes: Investigating joint attention using interactive eye-tracking and fMRI in a developmental sample

Cited by 72 publications (109 citation statements).
References 77 publications.
“…The validity of the predictions of the models in the context of verbal communication was not self-evident. The way of eliciting eye contact in the current study differed in multiple aspects from the ways in previous studies (Conty et al , 2007; Ethofer et al , 2011; von dem Hagen et al , 2014; Cavallo et al , 2015; Oberwelland et al , 2016) on which the models are largely based. First, in our study eye contact occurred in a verbal listening context.…”
Section: Discussion | Citation type: mentioning | Confidence: 90%
“…We expect that our experimental paradigm together with other interactive approaches (e.g. Oberwelland et al , 2016) will be a solid foundation for integrating these features in future studies.…”
Section: Discussion | Citation type: mentioning | Confidence: 99%
“…Similar tasks have been used in other fMRI studies using either gaze-contingent avatars (Oberwelland et al, in press) or live-video links to a real social partner (Redcay et al, 2012; Saito et al, 2010). Together, these interactive paradigms represent an important step towards an ecologically valid measure of joint attention.…”
Section: Introduction | Citation type: mentioning | Confidence: 99%
“…When approached by a virtual character exhibiting an angry expression, increased activation was found in the participant’s superior temporal sulcus, lateral fusiform gyrus, and a region of the middle temporal gyrus [104]. Studies have also used virtual characters for studies of joint attention [18,105,106,107,108,109]. Schilbach and colleagues [18,108] have introduced dynamic virtual characters in the scanner to characterize the neural correlates of being involved in social interactions.…”
Section: Virtual Reality for Simulating Impossible Social Interactions | Citation type: mentioning | Confidence: 99%