2016
DOI: 10.1073/pnas.1522615113
Early multisensory integration of self and source motion in the auditory system

Abstract: Discriminating external from self-produced sensory inputs is a major challenge for brains. In the auditory system, sound localization must account for movements of the head and ears, a computation likely to involve multimodal integration. Principal neurons (PNs) of the dorsal cochlear nucleus (DCN) are known to be spatially selective and to receive multimodal sensory information. We studied the responses of PNs to body rotation with or without sound stimulation, as well as to sound source rotation with station…

Cited by 20 publications (20 citation statements)
References 36 publications
“…Furthermore, coordinate transformations occur elsewhere in the auditory system [36,37] and behavioral movements can influence auditory subcortical and cortical processing [27,28]. Perhaps most importantly, vestibular signals are integrated into auditory processing already at the level of the cochlear nucleus [38], allowing the distinction between self and source motion [22]. Auditory-vestibular integration, together with visual, proprioceptive and motor corollary discharge systems, provides a mechanism through which changes in head direction can partially offset changes in acoustic input during movement to create allocentric representations.…”
Section: Discussion (mentioning)
confidence: 99%
“…While it has been largely assumed cortical neurons represent sound location relative to the head, the spatial coordinate frame in which location is encoded remains to be demonstrated. Furthermore, though the acoustic cues to sound localisation are explicitly head-centered, information about head direction necessary to form a world-centered representation is present at early levels of the ascending auditory system [22]. Thus it may be possible for neurons in the auditory system to represent space in an allocentric, world-centered coordinate frame that would preserve sound location across self-generated movement.…”
Section: Introduction (mentioning)
confidence: 99%
“…However, in the past decade, anatomical (Huang et al., 2013) and physiological evidence (Proville et al., 2014) has accumulated for mixed representations generated through multimodal integration in single granule cells of the cerebellar cortex (Arenz et al., 2008; Chabrol et al., 2015; Ishikawa et al., 2015). Multimodal integration has also been observed in granule cells of the mammalian auditory system (Wigderson et al., 2016) and the electrosensory system of weakly electric fish (Sawtell, 2010), which have a “cerebellar-like” circuitry (Oertel and Young, 2004; Bell et al., 2008; Kennedy et al., 2014; Singla et al., 2017). Mixed representations in the dentate gyrus are harder to ascertain, because entorhinal cortical neurons already integrate multisensory cues to generate spatial representations (Campbell et al., 2018).…”
Section: Updating Classical Concepts Of Pattern Separation Expansion (mentioning)
confidence: 99%
“…To control for possible alterations to fusiform cell firing rates by changing body position (Wigderson et al.), which may obscure I-width calculation, we recorded rate-level functions for broad-band noise (BBN) stimulation during each condition (Fig. D) and found that tilt change did not affect fusiform cell spontaneous or evoked firing rates (two-way ANOVA, F(17,2) = 0.26, P = 0.77).…”
Section: Results (mentioning)
confidence: 99%
“…; Koehler & Shore, ; Wigderson et al.), a second-order station for coding sound location (May, ; Young & Davis, ). But how multisensory information functionally influences brainstem sound localization processing has not been elucidated.…”
Section: Introduction (mentioning)
confidence: 99%