2017
DOI: 10.1016/j.jml.2016.12.002

Cognitive load makes speech sound fast, but does not modulate acoustic context effects

Abstract: In natural situations, speech perception often takes place during the concurrent execution of other cognitive tasks, such as listening while viewing a visual scene. The execution of a dual task typically has detrimental effects on concurrent speech perception, but how exactly cognitive load disrupts speech encoding is still unclear. The detrimental effect on speech representations may consist of either a general reduction in the robustness of processing of the speech signal ('noisy encoding'), o…

Cited by 51 publications (65 citation statements)
References 48 publications
“…We used a tACS montage targeting auditory cortices (Riecke, Sack, and Schroeder 2015), suggesting that the observed effect occurs in auditory cortical areas involved in speech processing. This notion is corroborated by findings showing that phonological information may be decoded from early auditory oscillatory activity (Di Liberto, O'Sullivan, and Lalor 2015; Ten Oever and Sack 2015), and that behavioral perceptual biases induced by fast vs. slow speech rhythms arise early in perception (Maslowski, Meyer, and Bosker 2019) and independently from attention (Bosker, Reinisch, and Sjerps 2017). Our results show no significant effect of tACS phase on vowel perception.…”
Section: Discussion (supporting)
confidence: 88%
“…Context effects have furthermore been shown to hold even for 2–4-month-old infants [26] and non-human species [27]. Lastly, effects of adjacent rate contexts are unaffected by attentional modulation, which supports the involvement of early perceptual processes [28].…”
Section: Introduction (mentioning)
confidence: 94%
“…in English, short /b/ vs. long /w/; in Dutch, short /ɑ/ vs. long /a:/) may be biased towards the longer phoneme (i.e. /w/ in English; /a:/ in Dutch) if presented after a preceding sentence (hereafter: carrier) produced at a faster speech rate (Bosker, Reinisch, & Sjerps, 2017; Kidd, 1989; Pickett & Decker, 1960; Reinisch & Sjerps, 2013; Toscano & McMurray, 2015). This process, known as rate normalisation, has been argued to involve general-auditory processes, since it occurs in human and nonhuman species (Dent, Brittan-Powell, Dooling, & Pierce, 1997), is induced by talker-incongruent contexts (Bosker, 2017b; Newman & Sawusch, 2009), and even by non-speech (Bosker, 2017a; Gordon, 1988; Wade & Holt, 2005); in contrast to other rate-dependent perceptual effects, such as the Lexical Rate Effect (Dilley & Pitt, 2010; Pitt, Szostak, & Dilley, 2016).…”
Section: Neural Entrainment Shapes Perception Of Subsequent Speech (mentioning)
confidence: 99%