2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society 2012
DOI: 10.1109/embc.2012.6346317
Directed causality of the human electrocorticogram during dexterous movement

Abstract: While significant strides have been made in designing brain-machine interfaces for use in humans, efforts to decode truly dexterous movements in real time have been hindered by difficulty extracting detailed movement-related information from the most practical human neural interface, the electrocorticogram (ECoG). We explore a potentially rich, largely untapped source of movement-related information in the form of cortical connectivity computed with time-varying dynamic Bayesian networks (TV-DBN). We discover …

Cited by 3 publications (2 citation statements); references 12 publications. Citing publications appeared in 2014 and 2016.
“…This difference in subnetwork dynamics—between intra-pSTG FNCs that are preferential for the external speech cue, and FNCs with SMC–Wernicke’s area interactions with stronger activation during the subject’s own verbal response—is consistent with recent studies suggesting functional distinctions between processing external and self-generated speech (Towle et al, 2008; Chang et al, 2013; Kort et al, 2014). The inclusion of SMC-Wernicke interactions in these FNCs, as well as in the red response-aligned FNCs from picture naming, is consistent with previous studies in nonhuman primates showing motor-sensory interactions during vocalization (Eliades and Wang, 2003); these interactions may reflect auditory feedback for motor control (Houde and Chang, 2015) and/or feedforward motor inputs for auditory state prediction (Hickok, 2012). Interestingly, subject 5’s SMC-Wernicke FNCs (red and pink in Figure 4) involved distinct sites superior to those implicated in her cue-preferential intra-pSTG FNC (green); this correlates well with predictions of a spatially distinct “sensorimotor interface” for speech (Hickok et al, 2003; Hickok and Poeppel, 2007; Hickok et al, 2009).…”
Section: Discussion (supporting)
confidence: 87%
“…Previous studies using ERI estimates found that many of the interactions between ECoG sites tend to modulate together over the course of a task (Korzeniewska et al, 2008, 2011; Benz et al, 2012a). We therefore hypothesized that subnetworks corresponding to distinct computational stages could be identified as groups of interactions with similarly timed task-related activation, which we term “functional network components” (FNCs).…”
Section: Introduction (mentioning)
confidence: 99%