2006
DOI: 10.1093/cercor/bhl141

When Language Meets Action: The Neural Integration of Gesture and Speech

Abstract: Although generally studied in isolation, language and action often co-occur in everyday life. Here we investigated one particular form of simultaneous language and action, namely speech and gestures that speakers use in everyday communication. In a functional magnetic resonance imaging study, we identified the neural networks involved in the integration of semantic information from speech and gestures. Verbal and/or gestural content could be integrated easily or less easily with the content of the preceding pa…

Cited by 295 publications (285 citation statements) · References 91 publications

“…pictures, line drawings, or gestures) into a context have reported a more frontal scalp distribution for the N400 effects (e.g. Federmeier & Kutas, 2001; Ganis, Kutas, & Sereno, 1996; Holle & Gunter, 2007; Willems et al., 2007). Here, it is suggested that in Experiment 2 the additional attempt to predict upcoming words involves (parts of) the language production system (Pickering & Garrod, 2007).…”
Section: Discussion (mentioning)
Confidence: 82%
“…For instance, in the case of language comprehension, it has become increasingly clear that the brain uses several types of information in a qualitatively similar way to arrive at a full understanding of a message. This includes information from world knowledge, co-speech gestures, pictures, speaker's identity derived from voice characteristics, and information from a preceding discourse (Federmeier & Kutas, 2001; Hagoort, 2005; Hagoort, Hald, Bastiaansen, & Petersson, 2004; Hagoort & van Berkum, in press; Nieuwland & van Berkum, 2006; Özyürek et al., in press; van Berkum, Hagoort, & Brown, 1999; van Berkum, Zwitserlood, Hagoort, & Brown, 2003; Willems et al., 2006). Importantly, these examples serve to demonstrate that the brain not only is capable of taking several streams of information into account, but actually does so in a qualitatively similar way.…”
Section: Discussion (mentioning)
Confidence: 97%
“…However, a remaining question is how comparable the semantic processing evoked by hand gestures is to that of linguistic items such as words. This was investigated in a pair of studies measuring the neural time course (using ERPs) as well as the neural locus (using fMRI) of the sentence-level integration of co-speech gestures and spoken words (Özyürek, Willems, Kita, & Hagoort, in press; Willems, Özyürek, & Hagoort, 2006). Subjects heard sentences in which a critical word was accompanied by a gesture.…”
Section: Co-speech Gestures (mentioning)
Confidence: 99%
“…Villarreal and colleagues [5] assessed cortical activity during recognition of communicative gestures containing symbolic connotations (e.g., …). The embodied theory of language assumes that language comprehension makes use of the neural system ordinarily recruited for action control [6]. Focusing on spoken language material related to concrete actions, recent neurophysiological studies have shown that premotor regions are involved in language processing [7]. Also, in keeping with the involvement of the motor system in processing action-related material, the results reported by Buccino et al. [8] in a single-pulse TMS study have shown that motor evoked potentials (MEPs) recorded from hand muscles are modulated during listening to hand-related action sentences.…”
Section: Introduction (mentioning)
Confidence: 99%