2023
DOI: 10.1080/23273798.2023.2295499

The role of co-speech gestures in retrieval and prediction during naturalistic multimodal narrative processing

Sergio Osorio, Benjamin Straube, Lars Meyer, et al.

Abstract: During daily communication, visual cues such as gestures accompany the speech signal and facilitate semantic processing. However, how gestures impact lexical retrieval and semantic prediction, especially in a naturalistic setting, remains unclear. Here, participants watched a naturalistic multimodal narrative, where an actor narrated a story and spontaneously produced co-speech gestures. For all content words, word frequency and lexical surprisal were regressed against the EEG using temporal response functions…
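As a rough illustration of the analysis approach named in the abstract, the sketch below fits a temporal response function (TRF) by regressing time-lagged, word-level regressors (word frequency and lexical surprisal, modelled as impulses at word onsets) against multichannel EEG with ridge regression. All variable names, the sampling rate, the lag window, and the regularisation strength are illustrative assumptions, not the authors' pipeline.

```python
# Minimal TRF sketch (assumed setup, not the authors' code): word-level
# predictors are placed as impulses at word onsets and mapped onto EEG
# via time-lagged ridge regression.
import numpy as np
from sklearn.linear_model import Ridge

sfreq = 128                                          # assumed EEG sampling rate (Hz)
n_samples, n_channels, n_words = 10_000, 64, 300     # illustrative sizes
rng = np.random.default_rng(0)

eeg = rng.standard_normal((n_samples, n_channels))   # placeholder EEG data
freq = np.zeros(n_samples)                           # word-frequency impulses
surp = np.zeros(n_samples)                           # lexical-surprisal impulses
onsets = rng.choice(n_samples - 200, size=n_words, replace=False)
freq[onsets] = rng.standard_normal(n_words)          # z-scored predictor values
surp[onsets] = rng.standard_normal(n_words)

# Build a time-lagged design matrix covering lags from 0 to ~600 ms,
# so EEG at time t is modelled from predictor values at t - lag.
lags = np.arange(0, int(0.6 * sfreq))
predictors = np.stack([freq, surp], axis=1)          # (n_samples, 2)
n_feat = predictors.shape[1]
X = np.zeros((n_samples, n_feat * len(lags)))
for i, lag in enumerate(lags):
    X[lag:, i * n_feat:(i + 1) * n_feat] = predictors[:n_samples - lag]

# Ridge regression maps the lagged predictors onto each EEG channel;
# the coefficients, reshaped over lags, are the estimated TRFs.
model = Ridge(alpha=1.0).fit(X, eeg)
trf = model.coef_.reshape(n_channels, len(lags), n_feat)
print(trf.shape)                                     # (channels, lags, predictors)
```

In practice a model of this kind is usually cross-validated over the regularisation parameter and evaluated on held-out EEG; toolboxes such as MNE-Python or the mTRF toolbox offer equivalent functionality.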

Cited by 3 publications (4 citation statements)
References 89 publications

“…12(2)(2024): 319-348 and 4) signalling thinking processes with metaphoric gestures. In this way, the findings align with previous studies on how co-speech gestures can offer both communicative and cognitive functions when at work (see Congdon et al, 2017; Fritz et al, 2021; Goldin-Meadow, 2005; Kita et al, 2017; Kita & Özyürek, 2003; Osorio et al, 2024). Co-speech gestures which are representational of the message content (e.g., iconic and deictic) can enhance the engineering presentations by reducing the cognitive load of processing them.…”
Section: Discussion (supporting)
confidence: 88%
“…Though speech can exist independently of gestures, the incorporation of gestures may help to enhance speech production and problem-solving (Beilock & Goldin-Meadow, 2010; Osorio et al, 2024). When conveying spatial information, adults become less fluent when the hands are not free to gesture, and such disruptions are often compensated with more repetitive verbal messages (Rauscher et al, 1996).…”
Section: Introduction (mentioning)
confidence: 99%
“…In this respect, the distinction between human and model processing is incomparable. Lastly, although each word in this study was presented for a relatively extended period, the neural activity for the current word still depended, to some extent, on the words that were previously presented [45,46]. This interdependence is a consideration for this research and serves as a caveat for current findings.…”
Section: Discussion (mentioning)
confidence: 91%
“…Such studies have been important in helping us understand how the brain processes music and speech, but under naturalistic scenarios, music and speech signals are far less periodic, are embedded in noise, and compete for attentional resources with input from other sensory modalities. For this reason, there have been recent calls and attempts to investigate brain responses to speech and music using less controlled materials and in the context of sensory multimodality (Broderick et al, 2018; Hamilton & Huth, 2020; He et al, 2015; Kern et al, 2022; Osorio et al, 2024; Poikonen et al, 2018; Sturm et al, 2015; Willems et al, 2020). Whether cortical tracking extends to more natural conditions therefore requires further inquiry.…”
Section: Introduction (mentioning)
confidence: 99%