Navigating "cocktail party" situations by enhancing foreground sounds over irrelevant background information is typically considered from a cortico-centric perspective. However, subcortical circuits, such as the medial olivocochlear reflex (MOCR) that modulates inner ear activity itself, have ample opportunity to extract salient features from the auditory scene prior to any cortical processing. To understand the contribution of auditory subcortical nuclei and the cochleae, physiological recordings were made along the auditory pathway while listeners differentiated non(sense)-words and words. Both naturally-spoken and intrinsically-noisy, vocoded speech (filtering that mimics processing by a cochlear implant) significantly activated the MOCR, whereas listening to speech-in-background noise instead revealed engagement of midbrain and cortical resources. An auditory periphery model reproduced these speech degradation-specific effects, providing a rationale for goal-directed MOCR gating to enhance the representation of speech features in the auditory nerve.
These results highlight two strategies co-existing in the auditory system to accommodate categorically different speech degradations.

cortical auditory responses in both the active listening task and when listeners were required to ignore the task and watch a silent, non-subtitled film.
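For orientation, the following is a minimal sketch of a noise-excited channel vocoder, the kind of "filtering that mimics processing by a cochlear implant" referred to in the abstract: the speech signal is split into frequency bands, the slowly-varying envelope of each band is extracted, and those envelopes are re-imposed on band-limited noise carriers. The band count, band edges, filter orders and envelope cutoff below are illustrative assumptions, not the parameters used in this study.

```python
# Sketch of a noise-excited channel vocoder (illustrative parameters only).
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def vocode(speech, fs, n_bands=8, f_lo=100.0, f_hi=8000.0, env_cutoff=50.0):
    """Replace the fine structure in each analysis band with band-limited noise.

    Assumes fs > 2 * f_hi. All parameter values are placeholders.
    """
    edges = np.geomspace(f_lo, f_hi, n_bands + 1)      # log-spaced band edges
    noise = np.random.randn(len(speech))                # broadband noise carrier
    env_sos = butter(2, env_cutoff, btype="lowpass", fs=fs, output="sos")
    out = np.zeros_like(speech)
    for lo, hi in zip(edges[:-1], edges[1:]):
        band_sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(band_sos, speech)
        # Slowly varying temporal envelope of this analysis band.
        env = sosfiltfilt(env_sos, np.abs(hilbert(band)))
        # Excite the same band with noise, modulated by the speech envelope.
        out += env * sosfiltfilt(band_sos, noise)
    # Match overall RMS to the input (a simple normalisation choice).
    rms_in = np.sqrt(np.mean(speech ** 2))
    rms_out = np.sqrt(np.mean(out ** 2)) + 1e-12
    return out * (rms_in / rms_out)
```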
Maintaining a fixed task difficulty across speech manipulations, we found measures of hearing function at the level of the cochlea, brainstem, midbrain and cortex to be modulated differently depending on the degradation type applied to speech sounds, and on whether or not speech was actively attended. Specifically, the MOCR, assessed in terms of the magnitude of click-evoked OAEs (CEOAEs), was activated by vocoded speech (an intrinsically degraded speech signal) but not by otherwise 'clean' speech presented in either babble noise or speech-shaped noise. Neural activity at the first synaptic stage of central processing in the cochlear nucleus (CN), assessed physiologically through auditory brainstem responses (ABRs), confirmed the reduction in cochlear gain for actively attended vocoded speech, but not speech-in-noise. Conversely, neural activity generated by the auditory midbrain was significantly increased in active vs. passive listening for speech in babble and speech-shaped noise, but not for vocoded speech. This increase was associated with elevated cortical markers of listening effort for the speech-in-noise conditions. A model of the auditory periphery including an MOC circuit with biophysically-realistic temporal dynamics confirmed the stimulus-dependent role of the MOCR in enhancing neural coding of speech signals. Our data suggest that otherwise identical performance in active listening tasks may invoke quite different efferent circuits, requiring different levels of listening effort, depending on the type of stimulus degradation experienced.
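As a rough illustration of what efferent gain control with realistic temporal dynamics can look like, the sketch below applies a first-order (exponential) activation/relaxation to a normalised MOC drive signal and converts it into a time-varying reduction of cochlear gain. The time constants, attenuation range and function names are assumptions chosen for illustration; they do not reproduce the auditory periphery model used in this study.

```python
# Sketch of first-order MOC-like gain dynamics (illustrative values only).
import numpy as np

def moc_gain(drive, fs, tau_on=0.06, tau_off=0.3, max_atten_db=20.0):
    """Return cochlear gain change (dB re. maximum gain) for an efferent drive.

    drive   : normalised MOC activation signal in [0, 1], sampled at fs (Hz).
    tau_on  : activation time constant in seconds (assumed value).
    tau_off : relaxation time constant in seconds (assumed value).
    """
    atten = np.zeros(len(drive))
    a = 0.0
    for i, d in enumerate(drive):
        # Faster build-up than decay, as for the slow MOC effect.
        tau = tau_on if d > a else tau_off
        # Euler step of da/dt = (d - a) / tau.
        a += (d - a) / (tau * fs)
        atten[i] = a
    return -max_atten_db * atten   # gain reduction in dB (negative values)
```

Driving such a stage with, for example, the running level of a simulated auditory-nerve response would reduce cochlear gain during sustained stimulation and release it during gaps, which is the kind of stimulus-dependent gating the model results above point to.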
Results
Iso-performance in three manipul...