Nonparametric spectral estimation is widely used in applications ranging from radar and seismic data analysis to electroencephalography (EEG) and speech processing. Among the techniques used to estimate the spectral representation of a system from finite observations, multitaper spectral estimation has many important optimality properties, yet it is not as widely used as it could be. We give a brief overview of standard nonparametric spectral estimation theory and of multitaper spectral estimation, and present two examples from EEG analyses of anesthesia and sleep.
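As a concrete illustration of the multitaper approach described in the abstract above, the sketch below averages periodograms computed with orthogonal Slepian (DPSS) tapers using SciPy. The time-bandwidth product, taper count, and normalization are illustrative assumptions, not the settings used in the paper.

```python
# Minimal multitaper PSD sketch (assumed parameters; not the authors' exact pipeline).
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, fs, time_bandwidth=4.0, n_tapers=7):
    """Average periodograms computed with orthogonal DPSS (Slepian) tapers."""
    n = len(x)
    tapers = dpss(n, time_bandwidth, Kmax=n_tapers)        # shape (n_tapers, n)
    # Taper the data, FFT each tapered copy, and average the eigenspectra.
    spectra = np.abs(np.fft.rfft(tapers * x, axis=-1)) ** 2
    psd = spectra.mean(axis=0) / fs
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, psd

# Example: 10 s of a 10 Hz sinusoid in white noise, sampled at 250 Hz.
fs = 250.0
t = np.arange(0, 10, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
freqs, psd = multitaper_psd(x, fs)
```

Averaging over several orthogonal tapers trades a small amount of spectral resolution for a substantial reduction in the variance of the estimate, which is the core advantage over a single-taper periodogram.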
Highlights:
- Automatic image segmentation defines auditory cortex fields by temporal responses
- Both onset and offset responses are tonotopically organized across auditory cortex
- Parallel neuronal networks process tone onset and offset in the auditory cortex
- A1 amplifies offset response by convergent thalamic input and intracortical processing
Sensory detection tasks enhance representations of behaviorally meaningful stimuli in primary auditory cortex (A1). However, it remains unclear how A1 encodes decision-making. Neurons in A1 layer 2/3 (L2/3) show heterogeneous stimulus selectivity and complex anatomical connectivity, and receive input from prefrontal cortex. Thus, task-related modulation of activity in A1 L2/3 might differ across subpopulations. To study the neural coding of decision-making, we used two-photon imaging in A1 L2/3 of mice performing a tone-detection task. Neural responses to targets showed attentional gain and encoded behavioral choice. To characterize network representation of behavioral choice, we analyzed functional connectivity using Granger causality, pairwise noise correlations, and neural decoding. During task performance, small groups of four to five neurons became sparsely linked, locally clustered, and rostro-caudally oriented, while noise correlations both increased and decreased. Our results suggest that sensory-based decision-making involves small neural networks driven by the sum of sensory input, attentional gain, and behavioral choice.
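One of the network measures mentioned above, pairwise noise correlations, is commonly computed by removing each condition's mean response and correlating the trial-to-trial residuals between neurons. The sketch below is a generic illustration under assumed data shapes and labels, not the paper's exact analysis pipeline.

```python
# Hedged sketch of pairwise noise correlations (illustrative data shapes assumed).
import numpy as np

def noise_correlations(responses, conditions):
    """responses: (n_trials, n_neurons) response matrix; conditions: (n_trials,) labels.
    Subtract each condition's mean response, then correlate residuals across neurons."""
    residuals = responses.astype(float)
    for c in np.unique(conditions):
        idx = conditions == c
        residuals[idx] -= residuals[idx].mean(axis=0)
    return np.corrcoef(residuals, rowvar=False)   # (n_neurons, n_neurons) matrix

# Toy usage: 200 trials, 10 neurons, two behavioral conditions (e.g., hit vs. miss).
rng = np.random.default_rng(0)
resp = rng.normal(size=(200, 10))
cond = rng.integers(0, 2, size=200)
rho = noise_correlations(resp, cond)
```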
The underlying mechanism of how the human brain solves the cocktail party problem is largely unknown. Recent neuroimaging studies, however, suggest salient temporal correlations between the auditory neural response and the attended auditory object. Using magnetoencephalography (MEG) recordings of the neural responses of human subjects, we propose a decoding approach for tracking the attentional state while subjects selectively listen to one of two speech streams in a competing-speaker environment. We develop a biophysically inspired state-space model to account for the modulation of the neural response with respect to the attentional state of the listener. The constructed decoder is based on a maximum a posteriori (MAP) estimate of the state parameters via the expectation-maximization (EM) algorithm. Using only the envelopes of the two speech streams as covariates, the proposed decoder enables us to track the attentional state of the listener with a temporal resolution on the order of seconds, together with statistical confidence intervals. We evaluate the performance of the proposed model using numerical simulations and experimentally measured evoked MEG responses from the human brain. Our analysis reveals considerable performance gains provided by the state-space model in terms of temporal resolution, computational complexity, and decoding accuracy.
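Since the decoder described above uses only the envelopes of the two speech streams as covariates, a common way to obtain such an envelope is sketched below: the magnitude of the Hilbert analytic signal, low-pass filtered and downsampled to the neural sampling rate. The cutoff frequency, filter order, and sampling rates are assumptions for illustration, not values taken from the paper.

```python
# Illustrative speech-envelope extraction for use as a decoder covariate.
# Band edge, filter order, and rates are assumed, not the paper's exact settings.
import numpy as np
from scipy.signal import hilbert, butter, sosfiltfilt, resample_poly

def speech_envelope(audio, fs_audio, fs_out=200, cutoff_hz=8.0):
    """Analytic-signal magnitude, low-pass filtered, downsampled to the neural rate."""
    env = np.abs(hilbert(audio))                                   # instantaneous amplitude
    sos = butter(4, cutoff_hz, btype="low", fs=fs_audio, output="sos")
    env = sosfiltfilt(sos, env)                                    # zero-phase low-pass
    return resample_poly(env, int(fs_out), int(fs_audio))          # match MEG sampling rate

# Example with a synthetic 1 s audio snippet at 16 kHz.
fs_audio = 16000
audio = np.random.randn(fs_audio)
env = speech_envelope(audio, fs_audio)
```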
Rhythmic oscillations shape cortical dynamics during active behavior, sleep, and general anesthesia. Cross-frequency phase-amplitude coupling is a prominent feature of cortical oscillations, but its role in organizing conscious and unconscious brain states is poorly understood. Using high-density EEG and intracranial electrocorticography during gradual induction of propofol general anesthesia in humans, we discovered a rapid drug-induced transition between distinct states with opposite phase-amplitude coupling and different cortical source distributions. One state occurs during unconsciousness and may be similar to sleep slow oscillations. A second state occurs at the loss or recovery of consciousness and resembles an enhanced slow cortical potential. These results provide objective electrophysiological landmarks of distinct unconscious brain states, and could be used to help improve EEG-based monitoring for general anesthesia.
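The phase-amplitude coupling referred to above is often quantified with a mean-vector-length modulation index: band-pass the signal into a slow band for phase and a faster band for amplitude, then measure how strongly the amplitude clusters at particular phases. The sketch below uses illustrative band edges (slow-oscillation phase, alpha-band amplitude) and filter settings that are assumptions, not the paper's exact analysis.

```python
# Hedged sketch of cross-frequency phase-amplitude coupling via a mean-vector-length
# modulation index. Band edges and filter settings are illustrative assumptions.
import numpy as np
from scipy.signal import hilbert, butter, sosfiltfilt

def bandpass(x, fs, lo, hi, order=4):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def pac_modulation_index(x, fs, phase_band=(0.1, 1.0), amp_band=(8.0, 12.0)):
    """Couple slow-oscillation phase with alpha-band amplitude (illustrative bands)."""
    phase = np.angle(hilbert(bandpass(x, fs, *phase_band)))   # phase of the slow band
    amp = np.abs(hilbert(bandpass(x, fs, *amp_band)))         # amplitude of the fast band
    return np.abs(np.mean(amp * np.exp(1j * phase)))          # mean vector length

# Example on 60 s of synthetic EEG-like data sampled at 250 Hz.
fs = 250
x = np.random.randn(60 * fs)
mi = pac_modulation_index(x, fs)
```

A larger index indicates that the fast-band amplitude is concentrated at a preferred phase of the slow oscillation; in practice the raw value is usually compared against surrogate data to assess significance.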