The past decade has witnessed a renewed interest in cortical local field potentials (LFPs), that is, extracellularly recorded potentials with frequencies of up to ~500 Hz. This is due to both the advent of multielectrodes, which has enabled recording of LFPs at tens to hundreds of sites simultaneously, and the insight that LFPs offer a unique window into key integrative synaptic processes in cortical populations. However, owing to its numerous potential neural sources, the LFP is more difficult to interpret than are spikes. Careful mathematical modelling and analysis are needed to take full advantage of the opportunities that this signal offers in understanding signal processing in cortical circuits and, ultimately, the neural basis of perception and cognition.
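To make the signal definition above concrete, here is a minimal Python sketch of how an LFP is conventionally obtained from a broadband extracellular recording: low-pass filtering below the ~500 Hz bound mentioned in the abstract. The sampling rate, filter order, and variable names are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: extracting an LFP from a broadband extracellular
# recording by low-pass filtering below ~500 Hz. Sampling rate and
# filter order are assumed for illustration.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 20000.0      # sampling rate of the raw recording (Hz), assumed
cutoff = 500.0    # conventional LFP upper frequency bound (Hz)

# Synthetic broadband trace standing in for a real recording.
rng = np.random.default_rng(0)
raw = rng.standard_normal(int(fs))  # 1 s of noise

# 4th-order Butterworth low-pass; filtfilt gives zero-phase filtering,
# which avoids shifting the LFP in time relative to spikes.
b, a = butter(4, cutoff / (fs / 2), btype="low")
lfp = filtfilt(b, a, raw)
```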
Several neural codes have been proposed to explain how neurons encode sensory information. Here we tested the hypothesis that different codes might be employed concurrently and provide complementary stimulus information. Quantifying the information encoded about natural sounds in the auditory cortex of alert animals, we found that temporal spike-train patterns and spatial patterns of population activity were both highly informative. However, the relative phase of slow ongoing rhythms at which these (temporal or population) responses occurred provided substantial additional and complementary information. Such nested codes, which combine spike-train patterns with the phase of firing, were not only the most informative but also the most robust to sensory noise added to the stimulus. Our findings suggest that processing in sensory cortices could rely on the concurrent use of several codes that combine information across different spatiotemporal scales. In addition, they suggest a role for slow cortical rhythms in stabilizing sensory representations by reducing the effects of noise.
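As a rough illustration of the nested-code idea, the sketch below tags spike counts with the phase of a slow rhythm (extracted by band-pass filtering and a Hilbert transform) and compares a plug-in mutual-information estimate for the count code alone against the joint (count, phase) code. The band limits, phase binning, toy data, and estimator are assumptions for illustration, not the authors' analysis pipeline.

```python
# Sketch of a nested "phase-of-firing" code: spike counts are tagged
# with the phase of a slow LFP rhythm, and stimulus information is
# estimated from the joint (count, phase) response. All parameters
# and the toy data are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def slow_phase(lfp, fs, band=(1.0, 4.0)):
    """Instantaneous phase of a slow rhythm via band-pass + Hilbert.
    (Not used in the toy below, which assumes phase bins are given.)"""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return np.angle(hilbert(filtfilt(b, a, lfp)))

def mutual_information(stimuli, responses):
    """Plug-in estimate of I(S;R) in bits from paired discrete labels."""
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((len(s_vals), len(r_vals)))
    for i, j in zip(s_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)
    pr = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])).sum())

# Toy usage: phase bins carry stimulus information that counts alone
# miss, so the nested (count, phase) label is more informative.
rng = np.random.default_rng(1)
stimuli = rng.integers(0, 2, 200)
counts = stimuli + rng.integers(0, 2, 200)               # counts weakly follow stimulus
phase_bin = (stimuli * 2 + rng.integers(0, 2, 200)) % 4  # phase adds information
nested = counts * 4 + phase_bin                          # joint (count, phase) label
print(mutual_information(stimuli, counts))   # count code alone
print(mutual_information(stimuli, nested))   # nested phase-of-firing code
```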
Our brain integrates the information provided by the different sensory modalities into a coherent percept, and recent studies suggest that this process is not restricted to higher association areas. Here we evaluate the hypothesis that auditory cortical fields are involved in cross-modal processing by probing individual neurons for audiovisual interactions. We find that visual stimuli modulate auditory processing at the level of both field potentials and single-unit activity, even in primary and secondary auditory fields. These interactions strongly depend on a stimulus's efficacy in driving the neurons but occur independently of stimulus category, for naturalistic as well as artificial stimuli. In addition, interactions are sensitive to the relative timing of audiovisual stimuli and are strongest when visual stimuli lead by 20-80 ms. Exploring the underlying mechanisms, we find that enhancement correlates with the resetting of slow (~10 Hz) oscillations to a phase angle of optimal excitability. These results demonstrate that visual stimuli can modulate the firing of neurons in auditory cortex in a manner that depends on stimulus efficacy and timing. These neurons thus meet the criteria for sensory integration and provide the auditory modality with multisensory contextual information about co-occurring environmental events.
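One common way to quantify the phase resetting described above is inter-trial phase coherence (ITC): if a visual stimulus resets the ~10 Hz rhythm, phases align across trials and ITC rises toward 1 after stimulus onset. The sketch below illustrates that measure on synthetic data; the band limits, sampling rate, and reset model are assumptions, not the authors' method.

```python
# Sketch: detecting phase resetting of a ~10 Hz oscillation via
# inter-trial phase coherence (ITC) around stimulus onset. All
# parameters and the synthetic data are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def itc(trials, fs, band=(8.0, 12.0)):
    """Inter-trial phase coherence of band-limited LFP segments.

    trials: array (n_trials, n_samples) aligned to stimulus onset;
    returns ITC as a function of time (n_samples,).
    """
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phases = np.angle(hilbert(filtfilt(b, a, trials, axis=1), axis=1))
    # Mean resultant length of the unit phase vectors across trials.
    return np.abs(np.exp(1j * phases).mean(axis=0))

# Toy usage: 50 trials of 10 Hz oscillations with random pre-stimulus
# phase that is reset to a common phase at onset (sample 500).
fs, n_trials, n = 1000.0, 50, 1000
t = np.arange(n) / fs
rng = np.random.default_rng(2)
trials = np.empty((n_trials, n))
for k in range(n_trials):
    pre = np.cos(2 * np.pi * 10 * t[:500] + rng.uniform(0, 2 * np.pi))
    post = np.cos(2 * np.pi * 10 * t[500:])  # phase reset at onset
    trials[k] = np.concatenate([pre, post])
c = itc(trials, fs)
print(c[250], c[750])  # low before onset, near 1 after
```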