2021
DOI: 10.3389/fnins.2021.634967

Manual Gestures Modulate Early Neural Responses in Loudness Perception

Abstract: How different sensory modalities interact to shape perception is a fundamental question in cognitive neuroscience. Previous studies of audiovisual interaction have focused on abstract levels such as categorical representation (e.g., the McGurk effect). It is unclear whether cross-modal modulation can extend to low-level perceptual attributes. This study used motional manual gestures to test whether and how loudness perception can be modulated by visual-motion information. Specifically, we implemented a nov…


Cited by 4 publications (3 citation statements, all classified as mentioning); all citing publications appeared in 2023.
References: 70 publications.
“…From a theoretical perspective, this research topic holds particular significance due to the limited number of studies that have specifically investigated the interaction between gestures conveying phonological information and speech processing [8]. While there is a growing body of literature on speech and gesture interaction, most studies have predominantly focused on semantic [9,10], iconic [11], or prosodic [12,13] gestures. Consequently, there is a noticeable gap in the literature on AV integration examining the effect of manual gestures conveying phonological information on automatic speech processing.…”
Section: Introduction (mentioning; confidence: 99%)
“…Based on all the above studies, we conclude that there is no consensus regarding how gestures prime speech. Some researchers have suggested that gestures prime speech in the phonological phase (Rauscher et al., 1996; Hadar and Butterworth, 1997; Krauss et al., 2000; Sun et al., 2021), some have found evidence only for the semantic phase of speech processing (Wu and Coulson, 2005; Holle and Gunter, 2007; Ozyurek et al., 2007; He et al., 2020), others have found an effect in both phases (Kelly et al., 2004), and still others have not discriminated between the two phases (Habets et al., 2011; Obermeier et al., 2011; Obermeier and Gunter, 2015).…”
Section: Introduction (mentioning; confidence: 99%)
“…Recent electrophysiological studies suggest that gestures modulate the amplitudes of evoked activity during both speech perception and sentence processing. At lower perceptual levels, co-speech gestures modulate the early N1-P2 components when single words are being processed (Kelly et al., 2004; Sun et al., 2021). At the semantic level, humans automatically integrate gesture and speech semantics during online processing, as reflected in the N400 component (Fabbri-Destro et al., 2015; Kelly et al., 2004; Özyürek et al., 2007; Willems et al., 2007; Wu & Coulson, 2005).…”
Section: Introduction (mentioning; confidence: 99%)