2006
DOI: 10.1088/1741-2560/3/2/010

Selection and parameterization of cortical neurons for neuroprosthetic control

Abstract: When designing neuroprosthetic interfaces for motor function, it is crucial to have a system that can extract reliable information from available neural signals and produce an output suitable for real-life applications. Systems designed to date have relied on establishing a relationship between neural discharge patterns in motor cortical areas and limb movement, an approach not suitable for patients who require such implants but who are unable to provide proper motor behavior to initially tune the system. We d…

Cited by 84 publications (92 citation statements); references 50 publications. Citing publications span 2007–2019.
“…In the current brain-driven interface prototypes, parameters of the observation model that relate neural activity to the user's intentions are learned by the algorithm during a training session consisting of real, imagined, or viewed actions (Carmena et al 2003; Leuthardt et al 2004; Musallam et al 2004; Serruya et al 2002; Shenoy et al 2003; Taylor et al 2002; Wahnoun et al 2006; Wolpaw and McFarland 2004). In practice, this facilitates rapid (also termed "instant") improvements in device performance because the device learns to match the user, whereas the user can maintain their operating assumptions about the device.…”
Section: Application 3: Adaptive Spike Filtering Under Ongoing Neuron… (mentioning)
confidence: 99%
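The training-session approach described in the statement above amounts to regressing an observation model from neural activity recorded during real, imagined, or viewed actions. Below is a minimal Python sketch of that general idea; the function names, array shapes, and the plain least-squares fit are illustrative assumptions, not the specific algorithms of the cited papers.

import numpy as np

def fit_observation_model(spike_counts, reference_velocity):
    # spike_counts:       (T, N) binned firing rates of N units from a training session
    # reference_velocity: (T, 2) velocity of the real, imagined, or viewed movement
    # Returns W, an (N + 1, 2) weight matrix mapping [rates, bias] -> velocity.
    T = spike_counts.shape[0]
    X = np.hstack([spike_counts, np.ones((T, 1))])   # append a bias column
    W, *_ = np.linalg.lstsq(X, reference_velocity, rcond=None)
    return W

def decode(spike_counts, W):
    # Apply the fitted observation model to new neural activity.
    T = spike_counts.shape[0]
    X = np.hstack([spike_counts, np.ones((T, 1))])
    return X @ W

Once W is fixed after the training session, the device-side mapping stays constant, consistent with the point above that the device learns to match the user while the user keeps stable operating assumptions about the device.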
“…Although neural activity parameters are commonly fit in prosthesis applications using separate intervals of training data (Carmena et al 2003; Hochberg et al 2006; Leuthardt et al 2004; Musallam et al 2004; Santhanam et al 2006; Serruya et al 2002; Taylor et al 2002; Wahnoun et al 2006; Wolpaw and McFarland 2004), these parameters can also be adjusted on the fly by simultaneously estimating the user's intentions and neural activity parameters (Eden et al 2004a; Eden et al 2004b). In a more modular "lock-step" implementation, the user's intentions are estimated, followed by refinements of the neural activity parameter estimates, but these actions are alternated at every time step.…”
Section: Application 3: Adaptive Spike Filtering Under Ongoing Neuron… (mentioning)
confidence: 99%
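The "lock-step" scheme described above can be sketched as alternating, at each time step, an intention estimate under the current observation model with an incremental refinement of that model given a reference signal. The sketch below uses a recursive least-squares update with a forgetting factor; the class name, the shapes, and the choice of RLS are assumptions for illustration, not the cited algorithms of Eden et al.

import numpy as np

class LockStepDecoder:
    # Alternates (1) intention estimation and (2) observation-model refinement
    # at every time step, rather than fitting on a separate training interval.
    def __init__(self, n_units, n_dims=2, forgetting=0.99):
        self.W = np.zeros((n_units + 1, n_dims))   # observation-model weights
        self.P = np.eye(n_units + 1) * 1e3         # RLS inverse-covariance estimate
        self.lam = forgetting                      # forgetting factor (0 < lam <= 1)

    def step(self, rates, reference=None):
        x = np.append(rates, 1.0)                  # firing rates plus a bias term
        intent = x @ self.W                        # (1) estimate the user's intention
        if reference is None:                      # no reference available: decode only
            return intent
        # (2) refine the observation-model parameters toward the reference
        Px = self.P @ x
        gain = Px / (self.lam + x @ Px)
        self.P = (self.P - np.outer(gain, Px)) / self.lam
        self.W += np.outer(gain, reference - intent)
        return intent

In a running system the reference would come from a simultaneous estimate of the user's goal (or a cued target) rather than from overt movement, so the decode step and the parameter update genuinely alternate at every time step, as the statement above describes.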
“…Others have shown that real-time decoding parameters can be computed from neural recordings made while monkeys passively observe the corresponding movements (Wahnoun et al, 2006) or when paralyzed patients imagine tracking a moving cursor (Hochberg et al, 2006). Hatsopoulos and his colleagues have compared the performance of such observation-based decoders with that of movement-based decoders in providing real-time control.…”
Section: Visual Decoding (mentioning)
confidence: 99%
“…At the single-cell level, observation-related MI activity has only been used to set the initial parameters for motor decoding schemes (Hochberg et al, 2006; Wahnoun et al, 2006), but the origin, extent, and character of this activity have not been described previously.…”
Section: Introduction (mentioning)
confidence: 99%