Bodily rhythms such as respiration are increasingly acknowledged to modulate neural oscillations underlying human action, perception, and cognition. In contrast, the link between respiration and aperiodic brain activity - a non-oscillatory reflection of excitation-inhibition (E:I) balance - has remained unstudied. Aiming to disentangle potential respiration-related dynamics of periodic and aperiodic activity, we applied novel algorithms for time-resolved parameter estimation to resting-state M/EEG data from two recording sites (N = 78). Our findings highlight the role of respiration as a physiological influence on brain signalling. We provide the first evidence that fluctuations of aperiodic brain activity (1/f slope) are phase-locked to the respiratory cycle, which strongly suggests that spontaneous state shifts of E:I balance are at least partly influenced by peripheral bodily signals. Moreover, the distinct temporal dynamics of respiration's coupling to non-oscillatory and oscillatory activity point towards a functional distinction in how each component relates to respiration.
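The 1/f slope mentioned above is conventionally estimated as the slope of log power against log frequency. A minimal sketch on synthetic data (a simplified static fit, not the time-resolved estimation used in the study; all variable names are illustrative):

```python
import numpy as np

def aperiodic_slope(freqs, psd):
    """Estimate the 1/f exponent by a least-squares line fit in log-log space.

    Returns chi in psd ~ f**(-chi); a steeper (larger) chi is commonly
    interpreted as a shift of the E:I balance towards inhibition.
    """
    slope, intercept = np.polyfit(np.log10(freqs), np.log10(psd), 1)
    return -slope  # negate: power falls off with increasing frequency

# Synthetic peak-free spectrum with a known exponent of 2.0
freqs = np.linspace(1, 40, 200)
psd = 10.0 * freqs ** -2.0
print(aperiodic_slope(freqs, psd))  # → 2.0 (up to float precision)
```

In practice, oscillatory peaks must be separated from the aperiodic component before (or jointly with) this fit; this sketch assumes a peak-free spectrum.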
When we attentively listen to an individual’s speech, our brain activity dynamically aligns to the incoming acoustic input at multiple timescales. Although this systematic alignment between ongoing brain activity and speech in auditory brain areas is well established, the acoustic events that drive this phase-locking are not fully understood. Here, we use magnetoencephalographic recordings from 24 human participants (12 females) listening to a 1 h story. We show that whereas speech–brain coupling is associated with sustained acoustic fluctuations in the speech envelope in the theta-frequency range (4–7 Hz), speech tracking in the low-frequency delta band (below 1 Hz) was strongest around onsets of speech, such as the beginning of a sentence. Crucially, delta tracking in bilateral auditory areas was not sustained after onsets, suggesting that delta tracking during continuous speech perception is driven by speech onsets. We conclude that onsets and sustained components of speech contribute differentially to speech tracking in the delta- and theta-frequency bands, orchestrating the sampling of continuous speech. Thus, our results suggest a temporal dissociation of acoustically driven oscillatory activity in auditory areas during speech tracking, with implications for the orchestration of speech tracking at multiple timescales.
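Speech–brain coupling of this kind is often quantified as phase-locking between the band-limited speech envelope and the neural signal. A minimal sketch on synthetic signals (a generic phase-locking value, not the study's actual tracking measure; signal names are illustrative):

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def phase_locking_value(x, y, fs, band):
    """Phase-locking value between two signals within a frequency band (0..1)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    phase_x = np.angle(hilbert(filtfilt(b, a, x)))
    phase_y = np.angle(hilbert(filtfilt(b, a, y)))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))

fs = 200
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
envelope = np.sin(2 * np.pi * 5 * t)                                 # 5 Hz "speech envelope"
meg = np.sin(2 * np.pi * 5 * t + 0.5) + rng.normal(0, 0.5, t.size)   # phase-coupled + noise
unrelated = rng.normal(0, 1, t.size)                                  # no coupling

print(phase_locking_value(envelope, meg, fs, (4, 7)))        # close to 1
print(phase_locking_value(envelope, unrelated, fs, (4, 7)))  # much lower
```

A constant phase lag (here 0.5 rad) still yields a high value, since the measure tests phase consistency, not zero-lag alignment.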
Speech production and perception are fundamental processes of human cognition that both rely on an internal forward model that is still poorly understood. Here, we study this forward model by using magnetoencephalography (MEG) to comprehensively map the connectivity of regional brain activity, both within the brain and to the speech envelope, during continuous speaking and listening. Our results reveal a partly shared neural substrate for both processes but also a dissociation in space, delay, and frequency. During speaking, neural activity in motor and frontal areas is coupled to succeeding speech in the delta band (1-3 Hz), whereas theta-range coupling in temporal areas follows speech. Connectivity analyses further revealed a separation of bottom-up and top-down signalling into distinct frequency bands during speaking. Together, we show that frequency-specific connectivity channels for bottom-up and top-down signalling support continuous speaking and listening in a way that is consistent with the predictive coding framework.
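Whether neural activity leads the speech envelope (as for motor/frontal delta coupling to succeeding speech) or follows it can be read off the lag at which their correlation peaks. A minimal sketch on synthetic data (a plain lagged correlation, not the study's connectivity analysis; names are illustrative):

```python
import numpy as np

def lagged_correlation(brain, envelope, fs, max_lag_s=0.5):
    """Correlation between a neural signal and the speech envelope at a range
    of delays. A positive peak lag means the brain signal precedes the
    envelope; a negative one means it follows."""
    max_lag = int(max_lag_s * fs)
    lags = np.arange(-max_lag, max_lag + 1)
    r = []
    for lag in lags:
        if lag >= 0:
            a, b = brain[: len(brain) - lag or None], envelope[lag:]
        else:
            a, b = brain[-lag:], envelope[: len(envelope) + lag]
        r.append(np.corrcoef(a, b)[0, 1])
    return lags / fs, np.array(r)

fs = 100
rng = np.random.default_rng(1)
# Smoothed noise as a non-periodic stand-in for a delta-range envelope
envelope = np.convolve(rng.normal(size=2000), np.ones(25) / 25, mode="same")
brain = np.roll(envelope, -10)  # brain signal leads the envelope by 100 ms

lags_s, r = lagged_correlation(brain, envelope, fs)
print(lags_s[np.argmax(r)])  # → 0.1
```

Using smoothed noise rather than a sinusoid keeps the correlation peak unique; with a periodic envelope the peak lag would be ambiguous modulo the period.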