2015
DOI: 10.1162/jocn_a_00688
Multisensory Integration: The Case of a Time Window of Gesture–Speech Integration

Abstract: This experiment investigates the integration of gesture and speech from a multisensory perspective. In a disambiguation paradigm, participants were presented with short videos of an actress uttering sentences like "She was impressed by the BALL, because the GAME/DANCE...." The ambiguous noun (BALL) was accompanied by an iconic gesture fragment containing information to disambiguate the noun toward its dominant or subordinate meaning. We used four different temporal alignments between noun and gesture fragment: …

Cited by 28 publications (34 citation statements); citing publications span 2015–2023.
References 73 publications (145 reference statements).

Citation statements (ordered by relevance):
“…Note, for instance, that the PLV time profiles peak at word onset, showing a precise timing of the gesture effect on auditory speech. This temporal pattern coincides well with studies that investigated the time window of gesture-speech integration from an ERP perspective (Habets et al., 2011; Obermeier et al., 2011; Obermeier & Gunter, 2014). Moreover, please note that the two relevant conditions in our study (words with gesture and words without gesture, in the AV modality) contained body-motion visual information (the upper part of the speaker's body was visible in both conditions), with the only difference being the presence/absence of a hand beat gesture.…”
Section: Discussion (supporting)
Confidence: 90%
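For readers unfamiliar with the measure, the PLV (phase-locking value) time profile mentioned in the statement above quantifies how consistently two signals keep a fixed phase relation across trials at each time point. The following Python sketch shows the general technique only; it is not the cited study's pipeline, and the epoch arrays, sampling rate, and theta band limits are hypothetical placeholders.

```python
# Minimal PLV time-profile sketch (assumption: NumPy/SciPy available).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv_time_profile(epochs_a, epochs_b, fs, band=(4.0, 7.0)):
    """PLV between two signals at each time point, computed across trials.

    epochs_a, epochs_b : (n_trials, n_samples) arrays, e.g. EEG and a
                         speech-derived signal epoched around word onset
    fs                 : sampling rate in Hz
    band               : band-pass limits in Hz (theta here, an assumption)
    """
    nyq = fs / 2.0
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    # Instantaneous phase via band-pass filtering + Hilbert transform
    phase_a = np.angle(hilbert(filtfilt(b, a, epochs_a, axis=1), axis=1))
    phase_b = np.angle(hilbert(filtfilt(b, a, epochs_b, axis=1), axis=1))
    # Mean resultant length of the phase difference across trials:
    # 1.0 = perfect phase locking, ~0 = no consistent phase relation
    return np.abs(np.mean(np.exp(1j * (phase_a - phase_b)), axis=0))

# Hypothetical usage: 40 trials of 1 s epochs centred on word onset, 500 Hz
fs = 500
rng = np.random.default_rng(0)
plv = plv_time_profile(rng.standard_normal((40, fs)),
                       rng.standard_normal((40, fs)), fs)
peak_ms = (np.argmax(plv) / fs) * 1000  # latency of the PLV peak in ms
```

A PLV profile that peaks at word onset, as the statement reports, would show its maximum at the epoch's zero point rather than before or after it.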
“…Previous ERP studies have investigated this temporal window of integration between concurrent gestures describing an object (i.e., iconic gestures) and speech (Habets et al., 2011; Obermeier, Holle & Gunter, 2011; Obermeier & Gunter, 2014). For instance, Obermeier & Gunter (2014) showed that the ERP signature of semantic integration between gesture and speech was affected by incongruence when an approximate temporal overlap between the gesture fragment and its affiliate word was maintained (between −200 ms and +120 ms around the word's identification point).…”
Section: Introduction (mentioning)
Confidence: 99%
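The −200 ms to +120 ms window quoted above is simple to express as a check on a gesture–speech offset. Only the window bounds come from the cited finding; the function name and the offsets tested below are illustrative assumptions.

```python
# Illustrative only: bounds from Obermeier & Gunter (2014) as quoted above.
INTEGRATION_WINDOW_MS = (-200, 120)  # around the word's identification point

def within_integration_window(offset_ms: float) -> bool:
    """True if a gesture fragment's temporal offset from the word's
    identification point falls inside the reported window."""
    lo, hi = INTEGRATION_WINDOW_MS
    return lo <= offset_ms <= hi

for offset in (-250, -200, 0, 120, 300):
    print(f"{offset:+d} ms -> {within_integration_window(offset)}")
```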
“…Studies using electrophysiological measures of brain activity and event-related potentials (ERPs), that is, more implicit measures of processing, have shown that semantic integration is affected by advancing gestures by more than 200 ms (Habets et al., 2011; Obermeier & Gunter, 2014).…”
Section: Speech-Gesture Coordination in Reception (mentioning)
Confidence: 99%
“…Some studies record actors performing scripted gestures (e.g., Cassell, McNeill, & McCullough, 1999; Woodall & Burgoon, 1981). Others use video editing, combining different image sequences with the same audio track (e.g., Habets et al., 2011; Leonard & Cummins, 2011; Obermeier & Gunter, 2014), typically examining one gesture in isolation. This approach often requires the speaker's face to be masked to avoid distraction from asynchronous speech and lip movement.…”
Section: Speech-Gesture Coordination in Reception (mentioning)
Confidence: 99%