2009
DOI: 10.1002/hbm.20774
Co‐speech gestures influence neural activity in brain regions associated with processing semantic information

Abstract: Everyday communication is accompanied by visual information from several sources, including co-speech gestures, which provide semantic information listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory-only (speech alone) condition. In the first audiovisual cond…



Cited by 140 publications (168 citation statements)
References 130 publications (201 reference statements)
“…It seems reasonable that co-speech gesture production engages the gesture network because it combines conceptual as well as skill-related aspects of actions that are essential for the meaningful hand movements of co-speech gesturing. Importantly, the second activity pattern includes fronto-parietal areas that are also frequently found in neuroimaging studies of co-speech gesture perception (Dick et al., 2009; Holle et al., 2008; Kircher et al., 2009; Straube et al., 2011; Willems et al., 2007, 2009).…”
Section: Discussion
Citation type: mentioning
confidence: 79%
“…Their coordination with speech is both temporal-kinetic and semantic (Kita & Özyürek, 2003; Loehr, 2007), and they have been shown to affect learning and memory in the listener as well as the speaker (Goldin-Meadow, 2003; Hostetter, 2011; Marstaller & Burianová, 2013). Previous neuroimaging studies have found that the observation of co-speech gestures engages superior and middle temporal gyrus, intraparietal sulcus, and inferior frontal gyrus (Dick et al., 2009; Holle et al., 2008, 2010; Hubbard et al., 2009; Kircher et al., 2009; Skipper et al., 2007, 2009; Straube et al., 2011; Willems et al., 2007, 2009). The findings from these studies strongly suggest that during the observation of co-speech gestures, frontal and temporal regions are engaged in semantic processing, whereas frontal and parietal areas are activated for action understanding (Marstaller & Burianová, 2014).…”
Section: Introduction
Citation type: mentioning
confidence: 99%
“…Generally, the left ventral premotor BA 6/44 complex is also involved in these tasks, but the lack of a formal comparison with corresponding linguistic conditions, as we have performed here, does not allow the question of hemispheric dominance for the two modalities to be tackled. Interestingly, recent studies (Dick, Goldin-Meadow, Hasson, Skipper, & Small, 2009; Green et al., 2009; Straube, Green, Weis, Chatterjee, & Kircher, 2009) found that the right inferior frontal gyrus displayed more activity when hand movements were semantically unrelated than when they were related to the accompanying speech. Altogether, these findings may be compatible with the results of our study, in which the participants were required to choose a compatible outcome for the intention revealed by the extralinguistic communicative gesture, and discard the noncompatible outcome.…”
Section: Communicative Modalities
Citation type: mentioning
confidence: 99%
“…The fact that we did not find pSTS activation in the conjunction analysis comparing a bimodal (i.e., speech + pointing) condition to the sum of the unimodal conditions may be due to the absence of motion in our visual stimuli (cf. Dick et al., 2009). The current study may serve as a baseline for future studies investigating the processing of pointing gestures and speech in more dynamic and interactive situations (cf.…”
Section: Discussion
Citation type: mentioning
confidence: 99%