For many of our senses, the role of the cerebral cortex in detecting stimuli is controversial. Here we examine the effects of both acute and chronic inactivation of the primary somatosensory cortex in mice trained to move their large facial whiskers to detect an object by touch and to respond by pressing a lever to obtain a water reward. Using transgenic mice, we expressed inhibitory opsins in excitatory cortical neurons. Transient optogenetic inactivation of the primary somatosensory cortex, as well as permanent lesions, initially produced both movement and sensory deficits that impaired detection behaviour, demonstrating the link between sensory and motor systems during active sensing. Unexpectedly, lesioned mice had recovered full behavioural capabilities by the subsequent session. This rapid recovery was experience-dependent, and early re-exposure to the task after lesioning facilitated recovery. Furthermore, ablation of the primary somatosensory cortex before learning did not affect task acquisition. This combined optogenetic and lesion approach suggests that manipulations of the sensory cortex may be only temporarily disruptive to other brain structures that are themselves capable of coordinating multiple, arbitrary movements with sensation. Thus, the somatosensory cortex may be dispensable for active detection of objects in the environment.
Animals can selectively respond to a target sound despite simultaneous distractors, just as humans can respond to one voice at a crowded cocktail party. To investigate the underlying neural mechanisms, we recorded single-unit activity in primary auditory cortex (A1) and medial prefrontal cortex (mPFC) of rats selectively responding to a target sound from a mixture. We found that prestimulus activity in mPFC encoded the selection rule: which sound from the mixture the rat should select. Moreover, electrically disrupting mPFC significantly impaired performance. Surprisingly, prestimulus activity in A1 also encoded the selection rule, a cognitive variable typically considered the domain of prefrontal regions. Prestimulus changes correlated with stimulus-evoked changes, but stimulus tuning was not strongly affected. We suggest a model in which anticipatory activation of a specific network of neurons underlies the selection of a sound from a mixture, giving rise to robust and widespread rule encoding in both brain regions.
Neuroscientists use many different software tools to acquire, analyze and visualize electrophysiological signals. However, incompatible data models and file formats make it difficult to exchange data between these tools. This reduces scientific productivity, renders potentially useful analysis methods inaccessible and impedes collaboration between labs. A common representation of the core data would improve interoperability and facilitate data-sharing. To that end, we propose here a language-independent object model, named “Neo,” suitable for representing data acquired from electroencephalographic, intracellular, or extracellular recordings, or generated from simulations. As a concrete instantiation of this object model we have developed an open source implementation in the Python programming language. In addition to representing electrophysiology data in memory for the purposes of analysis and visualization, the Python implementation provides a set of input/output (IO) modules for reading/writing the data from/to a variety of commonly used file formats. Support is included for formats produced by most of the major manufacturers of electrophysiology recording equipment and also for more generic formats such as MATLAB. Data representation and data analysis are conceptually separate: it is easier to write robust analysis code if it is focused on analysis and relies on an underlying package to handle data representation. For that reason, and also to be as lightweight as possible, the Neo object model and the associated Python package are deliberately limited to representation of data, with no functions for data analysis or visualization. Software for neurophysiology data analysis and visualization built on top of Neo automatically gains the benefits of interoperability, easier data sharing and automatic format conversion; there is already a burgeoning ecosystem of such tools. We intend that Neo should become the standard basis for Python tools in neurophysiology.
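The hierarchy Neo describes (a Block containing Segments, each holding recorded signals) can be loosely illustrated with plain Python dataclasses. This is a simplified sketch, not Neo's actual API: the real Neo objects carry physical units via the quantities package, annotations, and many more data types (spike trains, events, epochs).

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative sketch of a Neo-style container hierarchy.
# Names and fields here are assumptions for exposition only;
# Neo's real AnalogSignal is a units-aware array, not a list.

@dataclass
class AnalogSignal:
    samples: List[float]        # recorded values
    sampling_rate_hz: float     # samples per second
    units: str = "mV"

    def duration_s(self) -> float:
        """Signal duration in seconds."""
        return len(self.samples) / self.sampling_rate_hz

@dataclass
class Segment:
    """One recording epoch, e.g. a single trial."""
    analogsignals: List[AnalogSignal] = field(default_factory=list)

@dataclass
class Block:
    """Top-level container, e.g. one recording session."""
    segments: List[Segment] = field(default_factory=list)

# Build a block with one segment holding a 1-second signal at 1 kHz.
sig = AnalogSignal(samples=[0.0] * 1000, sampling_rate_hz=1000.0)
block = Block(segments=[Segment(analogsignals=[sig])])
print(block.segments[0].analogsignals[0].duration_s())  # 1.0
```

The design point the sketch captures is the separation of concerns the abstract describes: these containers only represent data, so any analysis or visualization code written against them stays independent of how the data was acquired or stored on disk.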
Neurons recorded in behaving animals often do not discernibly respond to sensory input and are not overtly task-modulated. These non-classically responsive neurons are difficult to interpret and are typically excluded from analysis, confounding attempts to connect neural activity to perception and behavior. Here, we describe a trial-by-trial, spike-timing-based algorithm to reveal the coding capacities of these neurons in auditory and frontal cortex of behaving rats. Both classically responsive and non-classically responsive cells contained significant information about sensory stimuli and behavioral decisions. Stimulus category was more accurately represented in frontal cortex than auditory cortex, via ensembles of non-classically responsive cells coordinating the behavioral meaning of spike timings on correct but not error trials. This unbiased approach allows the contribution of all recorded neurons, particularly those without obvious task-related, trial-averaged firing rate modulation, to be assessed for behavioral relevance on single trials.
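The abstract does not specify the algorithm's details, but the general idea of single-trial, spike-timing-based decoding can be sketched as follows. This is a hypothetical, simplified illustration on synthetic data, not the authors' method: it decodes stimulus category from a trial's inter-spike intervals (ISIs) by comparing log-likelihoods under per-category ISI distributions learned from training trials.

```python
import math
import random
from typing import List

random.seed(0)

# Synthetic data (an assumption for illustration): two stimulus
# categories bias a neuron's ISIs toward different timescales,
# even if its overall firing rate looks similar across trials.
def simulate_trial(category: str, n_spikes: int = 30) -> List[float]:
    mean_isi = 0.01 if category == "target" else 0.03  # seconds
    return [random.expovariate(1.0 / mean_isi) for _ in range(n_spikes)]

EDGES = [0.0, 0.005, 0.01, 0.02, 0.04, 0.08, 10.0]  # ISI bin edges (s)

def bin_index(isi: float) -> int:
    """Return the histogram bin for an ISI, or -1 if out of range."""
    for i in range(len(EDGES) - 1):
        if EDGES[i] <= isi < EDGES[i + 1]:
            return i
    return -1

def isi_distribution(isis: List[float]) -> List[float]:
    """Laplace-smoothed probability of each ISI bin."""
    counts = [0] * (len(EDGES) - 1)
    for isi in isis:
        i = bin_index(isi)
        if i >= 0:
            counts[i] += 1
    total = sum(counts)
    return [(c + 1) / (total + len(counts)) for c in counts]

# Train: pool ISIs from 50 trials per category into one distribution each.
train = {c: [simulate_trial(c) for _ in range(50)]
         for c in ("target", "nontarget")}
model = {c: isi_distribution([isi for trial in trials for isi in trial])
         for c, trials in train.items()}

def decode(isis: List[float]) -> str:
    """Pick the category whose ISI distribution best explains the trial."""
    def loglik(category: str) -> float:
        probs = model[category]
        return sum(math.log(probs[bin_index(isi)])
                   for isi in isis if bin_index(isi) >= 0)
    return max(model, key=loglik)

# Evaluate on fresh single trials.
test_labels = ["target", "nontarget"] * 25
correct = sum(decode(simulate_trial(c)) == c for c in test_labels)
accuracy = correct / len(test_labels)
print(f"single-trial decoding accuracy: {accuracy:.2f}")
```

Because the decoder operates on one trial at a time, it can assign behavioral meaning to spike timing even for cells whose trial-averaged firing rate shows no obvious task modulation, which is the spirit of the approach the abstract describes.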