2018
DOI: 10.1098/rsos.170909

Rapid recalibration of speech perception after experiencing the McGurk illusion

Abstract: The human brain can quickly adapt to changes in the environment. One example is phonetic recalibration: a speech sound is interpreted differently depending on the accompanying visual speech, and this interpretation persists in the absence of visual information. Here, we examined the mechanisms of phonetic recalibration. Participants categorized the auditory syllables /aba/ and /ada/, which were sometimes preceded by so-called McGurk stimuli (in which an /aba/ sound, due to visual /aga/ input, is often perceived as ‘ada’…

Cited by 9 publications (10 citation statements)
References: 42 publications
“…Audiovisual information proved more effective than lexical cues in inducing subsequent retuning effects, in line with prior findings (Lüttke et al., 2018; Mitterer & Reinisch, 2017; …). This difference is predicted given the visual salience of the /p/-/t/ contrast (a bilabial vs. an alveolar plosive) compared to the subtlety of the auditory difference between the same two sounds (both voiceless, both plosive).…”
Section: Discussion (supporting)
Confidence: 84%
“…In a ventriloquist paradigm, for example, the sight of the puppet and the actor’s voice are combined when localizing the speech source, and both cues influence the localization of subsequent unisensory acoustic cues, if probed experimentally (Bosen et al, 2017; Bosen et al, 2018; Bruns and Röder, 2015; Bruns and Röder, 2017; Callan et al, 2015; Radeau and Bertelson, 1974; Recanzone, 1998). This trial-by-trial recalibration of perception by previous multisensory information has been demonstrated for spatial cues, temporal cues, and speech signals (Kilian-Hütten et al, 2011a; Lüttke et al, 2016; Lüttke et al, 2018; Van der Burg et al, 2013), and has been shown to be modulated by attention (Eramudugolla et al, 2011). Despite the importance of both facets of multisensory perception for adaptive behavior - the combination of information within a trial and the trial-by-trial adjustment of perception - it remains unclear whether they originate from shared neural mechanisms.…”
Section: Introduction (mentioning)
Confidence: 72%
“…One previous study combined both audiovisual and lexical cues in McGurk-style fusion percepts (e.g., auditory armabillo paired with visual armagillo resulting in a percept of the word armadillo), but these stimuli did not induce significant perceptual shifts (Samuel & Lieblich, 2014). McGurk-style fusion stimuli can lead to perceptual shifts (Lüttke, Pérez-Bellido, & de Lange, 2018; Roberts & Summerfield, 1981; Saldaña & Rosenblum, 1994), but such stimuli often combine clear audio of a syllable (/ba/) with an incongruent video of another syllable (such as /ga/), leading to an entirely new percept (/da/). The combination of lexical and audiovisual cues in these McGurk percepts may not allow for perceptual adjustments.…”
Section: Discussion (mentioning)
Confidence: 96%