2016
DOI: 10.1093/scan/nsw127
Neural mechanisms of eye contact when listening to another person talking

Abstract: Eye contact occurs frequently and voluntarily during face-to-face verbal communication. However, the neural mechanisms underlying eye contact when it is accompanied by spoken language remain unexplored to date. Here we used a novel approach, fixation-based event-related functional magnetic resonance imaging (fMRI), to simulate the listener making eye contact with a speaker during verbal communication. Participants’ eye movements and fMRI data were recorded simultaneously while they were freely viewing a pre-rec…

Cited by 31 publications (56 citation statements: 4 supporting, 51 mentioning, 1 contrasting). Citing publications span 2018 to 2023.
References 60 publications.
“…Like Jiang et al (2016), we observed a preference for eye observations in the occipital lobe and adjacent inferior ventral and parieto-occipital cortex.…”
Section: Discussion (supporting, confidence: 50%)
“…While we were most interested in the anterior pSTS mouth-preferring region, our study complements the recent study of Jiang et al (2016), who focused on brain regions, including posterior pSTS, that were more active when participants fixated the eyes of talking faces. Like Jiang et al (2016), we observed a preference for eye observations in the occipital lobe and adjacent inferior ventral and parieto-occipital cortex.…”
Section: Discussion (mentioning, confidence: 68%)
“…Some studies reported that individuals with ASD gaze less to the face and the mouth during visual-speech recognition compared to typically developing controls (Irwin & Brancazio, 2014; Irwin, Tornatore, Brancazio, & Whalen, 2011; but see Foxe et al, 2015; Saalasti et al, 2012). Since gaze behavior influences brain responses to faces (Dalton et al, 2005; Jiang, Borowiak, Tudge, Otto, & von Kriegstein, 2017), we used an eye tracker in the MRI environment to assess where participants looked during visual-speech recognition.…”
(mentioning, confidence: 99%)