2020
DOI: 10.3758/s13414-020-02042-x
Audio-visual integration in noise: Influence of auditory and visual stimulus degradation on eye movements and perception of the McGurk effect

Abstract: Seeing a talker’s face can aid audiovisual (AV) integration when speech is presented in noise. However, few studies have simultaneously manipulated auditory and visual degradation. We aimed to establish how degrading the auditory and visual signal affected AV integration. Where people look on the face in this context is also of interest; Buchan, Paré and Munhall (Brain Research, 1242, 162–171, 2008) found fixations on the mouth increased in the presence of auditory noise whilst Wilson, Alsius, Paré and Munhall…

Cited by 17 publications (14 citation statements)
References 43 publications
“…We know from previous studies that visual context is more beneficial as acoustic degradation or noisy background increases [1][2][3]30], consistent with the principle of inverse effectiveness in multisensory integration [31,32]; but see [33]. Increased adversity in the acoustical environment may cause individuals to concentrate more on mouth movements [34,35], leading to greater engagement of the visual modality with the auditory modality. By using 24-channel vocoding, we were hoping to balance listening effort between the natural and degraded speech tokens, to rule out listening effort as a confound, while at the same time still examining neurophysiological sensitivity to semantic prediction.…”
Section: Discussion (supporting)
Confidence: 60%
“…In sum, previous research by Alsius et al. (2005), Gurler et al. (2015), Munhall et al. (2009), and Stacey et al. (2020) suggests that the degree to which people perceive the McGurk illusion depends (1) on their attention in general (with less attention leading to a decrease of the illusion) and (2) on their attentional focus on the speaker’s mouth versus elsewhere (with a focus on the mouth leading to an increase of the illusion).…”
Section: Introduction (mentioning)
Confidence: 54%
“…In sum, previous research by Alsius et al. (2005), Gurler et al. (2015), Munhall et al. (2009), and Stacey et al. (2020) suggests that the degree to which people perceive the McGurk illusion depends (1) on their attention in general (with less attention leading to a decrease of the illusion) and…”
Section: Multisensory Integration in Speech Perception (mentioning)
Confidence: 82%