2017
DOI: 10.1101/098798
Integration of visual information in auditory cortex promotes auditory scene analysis through multisensory binding

Cited by 25 publications (41 citation statements)
References 58 publications
“…This effect is principally driven by an improvement in the ability of listeners to exploit temporal coherence in the masker-coherent condition. We have demonstrated that the enhancement of one sound in a mixture by a temporally coherent visual stimulus is a stimulus-driven, attention-independent, bottom-up effect supported by the early integration of auditory and visual information in auditory cortex (Atilgan et al., 2018). In keeping with our behavioral data from naïve listeners, such an enhancement seems likely to facilitate selective attention when the temporally coherent stream is a target, and to impair it when that sound is a distractor.…”
Section: Discussion (supporting)
confidence: 63%
“…To simulate the effects of cooling on responses of individual units, we built a simple model mapping suppression of spiking activity to reduction in cortical temperature, using data obtained from simultaneous cooling and single-unit recording in suprasylvian cortex (SSY; higher-order visual cortex) (Wood, Town et al., 2017; Atilgan, Town et al., 2018). Figure 3B illustrates the suppression of responses of one SSY unit to visual stimuli when cooling was applied.…”
Section: Population decoding offers a mechanism for effects of cooling (mentioning)
confidence: 99%
“…Neural activity was recorded with a 16-channel, single-shank silicon probe (A1x16, Neuronexus Technologies, Ann Arbor, MI) placed at the center of a cooling loop, which was itself placed on the suprasylvian cortex, in the region posterior to auditory cortex (see Atilgan, Town et al., 2018 for further details). The design of the cooling-loop system was similar to that designed for behaving experiments, with the exception that the loop was not embedded within dental cement during anesthetized experiments.…”
Section: Neural recording during cooling (mentioning)
confidence: 99%
“…The neural basis of multisensory integration and its loci in the hierarchy of brain computations have been the focus of myriad studies (see for reviews Talsma et al., 2010; ten Oever et al., 2016; Keil and Senkowski, 2018). It is now well established that multisensory integration starts early in the processing chain (Schroeder and Foxe, 2005): both animal and human studies have demonstrated that the genesis of multisensory integration relies on cross-modal inputs to sensory cortices which inform about the spatiotemporal co-occurrence of sensory cues (Bizley et al., 2007; Lakatos et al., 2007; Kayser et al., 2008; Cappe et al., 2010; Mercier et al., 2013, 2015; Atilgan et al., 2018). At a later stage, the integration of information from different modalities pertains to the congruency and reliability of multisensory inputs, as well as task relevance (Noppeney, 2015, 2016; Kayser et al., 2017).…”
(mentioning)
confidence: 99%