Language tasks require the coordinated activation of multiple subnetworks—groups of related cortical interactions involved in specific components of task processing. Although electrocorticography (ECoG) has sufficient temporal and spatial resolution to capture the dynamics of event-related interactions between cortical sites, it is difficult to decompose these complex spatiotemporal patterns into functionally discrete subnetworks without explicit knowledge of each subnetwork’s timing. We hypothesized that subnetworks corresponding to distinct components of task-related processing could be identified as groups of interactions with co-varying strengths. In this study, five subjects implanted with ECoG grids over language areas performed word repetition and picture naming. We estimated the interaction strength between each pair of electrodes during each task using a time-varying dynamic Bayesian network (tvDBN) model constructed from the power of high gamma (70–110 Hz) activity, a surrogate for population firing rates. We then reduced the dimensionality of this model using principal component analysis (PCA) to identify groups of interactions with co-varying strengths, which we term functional network components (FNCs). This data-driven technique estimates both the weight of each interaction’s contribution to a particular subnetwork, and the temporal profile of each subnetwork’s activation during the task. We found FNCs with temporal and anatomical features consistent with articulatory preparation in both tasks, and with auditory and visual processing in the word repetition and picture naming tasks, respectively. These FNCs were highly consistent between subjects with similar electrode placement, and were robust enough to be characterized in single trials. Furthermore, the interaction patterns uncovered by FNC analysis correlated well with recent literature suggesting important functional-anatomical distinctions between processing external and self-produced speech. Our results demonstrate that subnetwork decomposition of event-related cortical interactions is a powerful paradigm for interpreting the rich dynamics of large-scale, distributed cortical networks during human cognitive tasks.
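Below is a minimal sketch of the dimensionality-reduction step described above: applying PCA across electrode-pair interaction time series to obtain candidate functional network components. It assumes the tvDBN interaction strengths have already been estimated; the array shapes, variable names, and synthetic data are illustrative and are not taken from the paper.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

n_pairs = 45        # number of electrode pairs (e.g., 10 electrodes -> 45 pairs)
n_timepoints = 200  # samples across the peri-event window

# Hypothetical tvDBN output: interaction strength for each electrode pair at each
# time point (stand-in random data here in place of real estimates).
interaction_strength = rng.standard_normal((n_timepoints, n_pairs))

# PCA over interaction pairs: each principal component is a candidate
# functional network component (FNC).
pca = PCA(n_components=5)
temporal_profiles = pca.fit_transform(interaction_strength)  # (n_timepoints, 5)
pair_weights = pca.components_                               # (5, n_pairs)

# pair_weights[k] gives each interaction's contribution to FNC k;
# temporal_profiles[:, k] gives FNC k's activation time course during the task.
print(pair_weights.shape, temporal_profiles.shape)
```

In this framing, the component loadings play the role of the interaction weights and the component scores play the role of the subnetwork activation profiles mentioned in the abstract.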
Neural keyword spotting could form the basis of a speech brain-computer interface for menu navigation if it can be done with low latency and high specificity comparable to the “wake-word” functionality of modern voice-activated AI assistant technologies. This study investigated neural keyword spotting using motor representations of speech via invasively recorded electrocorticographic signals as a proof of concept. Neural matched filters were created from monosyllabic consonant-vowel utterances: one keyword utterance and 11 similar non-keyword utterances. These filters were used in an analog of the acoustic keyword spotting problem, applied for the first time to neural data. The filter templates were cross-correlated with the neural signal, capturing temporal dynamics of neural activation across cortical sites. Neural vocal activity detection (VAD) was used to identify utterance times, and a discriminative classifier was used to determine whether these utterances were the keyword or non-keyword speech. Model performance appeared to be highly related to electrode placement and spatial density. Vowel height (/a/ vs /i/) was poorly discriminated in recordings from sensorimotor cortex, but was highly discriminable using neural features from superior temporal gyrus during self-monitoring. The best-performing neural keyword detection (five keyword detections with two false positives across 60 utterances) and neural VAD (100% sensitivity, ~1 false detection per 10 utterances) came from high-density (2 mm electrode diameter, 5 mm pitch) recordings from ventral sensorimotor cortex, suggesting that the spatial fidelity and extent of high-density ECoG arrays may be sufficient for speech brain-computer interfaces.
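Below is a minimal sketch of the matched-filter stage: cross-correlating a multichannel keyword template with ongoing neural activity and summing across channels to form a detection function. Template construction (e.g., averaging responses over keyword utterances) and the downstream VAD and classifier stages are assumed rather than shown, and all names and data here are illustrative.

```python
import numpy as np
from scipy.signal import correlate

rng = np.random.default_rng(1)

n_channels, n_samples, template_len = 16, 5000, 200

signal = rng.standard_normal((n_channels, n_samples))       # e.g., per-channel high-gamma power
template = rng.standard_normal((n_channels, template_len))   # keyword matched-filter template

# Cross-correlate each channel with its template and sum across channels,
# yielding a single detection function whose peaks mark candidate keyword times.
detection_fn = np.zeros(n_samples - template_len + 1)
for ch in range(n_channels):
    detection_fn += correlate(signal[ch], template[ch], mode="valid")

# Candidate detections: samples exceeding a threshold (arbitrary choice here);
# in practice these would be gated by the VAD and passed to the classifier.
threshold = detection_fn.mean() + 3 * detection_fn.std()
candidates = np.flatnonzero(detection_fn > threshold)
print(f"{candidates.size} samples exceed the detection threshold")
```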
Any given area in human cortex may receive input from multiple, functionally heterogeneous areas, potentially representing different processing threads. Alpha (8-13 Hz) and beta (13-20 Hz) oscillations have been hypothesized by other investigators to gate local cortical processing, but their influence on cortical responses to input from other cortical areas is unknown. To study this, we measured the effect of local oscillatory power and phase on cortical responses elicited by single-pulse electrical stimulation (SPES) at distant cortical sites, in awake human subjects implanted with intracranial electrodes for epilepsy surgery. In 4 out of 5 subjects, the amplitudes of corticocortical evoked potentials (CCEPs) elicited by distant SPES were reproducibly modulated by the power, but not the phase, of local oscillations in alpha and beta frequencies. Specifically, CCEP amplitudes were higher when average oscillatory power just before distant SPES (-110 to -10 ms) was high. This effect was observed in only a subset (0-33%) of sites with CCEPs and, like the CCEPs themselves, varied with stimulation at different distant sites. Our results suggest that although alpha and beta oscillations may gate local processing, they may also enhance the responsiveness of cortex to input from distant cortical sites.
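Below is a minimal sketch of the kind of analysis the abstract describes: comparing CCEP amplitudes between trials with high versus low pre-stimulus oscillatory power at the recording site. The band edges, window boundaries, filter order, and statistical test are illustrative assumptions (alpha and beta are pooled into one band here for brevity), and the trial data are synthetic stand-ins.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)
fs = 1000                         # sampling rate in Hz (assumed)
n_trials, n_samples = 100, 1000   # 1 s per trial; stimulation at 500 ms
stim_idx = 500

trials = rng.standard_normal((n_trials, n_samples))  # stand-in for recorded voltage

# Band-limited power in the pre-stimulus window (-110 to -10 ms), estimated
# with an 8-20 Hz band-pass filter and the Hilbert envelope.
b, a = butter(4, [8, 20], btype="bandpass", fs=fs)
envelope = np.abs(hilbert(filtfilt(b, a, trials, axis=1), axis=1))
pre_power = envelope[:, stim_idx - 110:stim_idx - 10].mean(axis=1)

# CCEP amplitude: peak absolute deflection in an early post-stimulus window.
ccep_amp = np.abs(trials[:, stim_idx + 10:stim_idx + 100]).max(axis=1)

# Split trials at the median pre-stimulus power and test for an amplitude difference.
high = ccep_amp[pre_power >= np.median(pre_power)]
low = ccep_amp[pre_power < np.median(pre_power)]
stat, p = mannwhitneyu(high, low)
print(f"median CCEP amplitude: high-power={np.median(high):.2f}, "
      f"low-power={np.median(low):.2f}, p={p:.3f}")
```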