2022
DOI: 10.1101/2022.07.04.498676
Preprint
Executed and imagined grasping movements can be decoded from lower dimensional representation of distributed non-motor brain areas

Abstract: Using brain activity directly as input for assistive tool control can circumvent muscular dysfunction and increase functional independence for physically impaired people. Most invasive motor decoding studies focus on decoding neural signals from the primary motor cortex, which provides a rich but superficial and spatially local signal. Initial non-primary motor cortex decoding endeavors have used distributed recordings to demonstrate decoding of motor activity by grouping electrodes in mesoscale brain regions.…

Cited by 4 publications (5 citation statements)
References 42 publications
“…Subsequently, experiment initializes a recorder instance and adds all streams to the list of streams it should record. For a movement experiment [19][20][21][22][23], the streams recorded could be the neural amplifier and experimental triggers. Additionally, a movement tracker [24][25][26] or a force sensor [27] could be added.…”
Section: Setup
confidence: 99%
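The recorder-and-streams pattern described in the quotation above can be sketched in plain Python. The `Stream` and `Recorder` names and their interface here are hypothetical illustrations, not the actual platform API; the point is only that an experiment registers every stream it wants persisted before the run starts.

```python
from dataclasses import dataclass, field


@dataclass
class Stream:
    """A named data source, e.g. a neural amplifier or trigger channel (hypothetical)."""
    name: str


@dataclass
class Recorder:
    """Collects the streams an experiment wants recorded (hypothetical sketch)."""
    streams: list = field(default_factory=list)

    def add_stream(self, stream: Stream) -> None:
        # Register a stream so it is included in the recording for this run.
        self.streams.append(stream)


# A movement experiment would add at least the amplifier and the triggers;
# a movement tracker or force sensor can be added optionally.
recorder = Recorder()
for name in ["neural_amplifier", "experiment_triggers", "movement_tracker"]:
    recorder.add_stream(Stream(name))
```

Keeping the recorder agnostic about stream contents is what lets the same setup serve movement, speech, and auditory experiments by swapping the stream list.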
“…We see no indications of different data quality in our neural decoding endeavors. We can decode speech [44,45] and movement trajectories [46] with performance equal to that using our previous setup.…”
Section: Practical Experience
confidence: 99%
“…Subsequently, the experiment initializes a Recorder instance and adds all streams to the list of streams it should record. For a movement experiment ([12], [13], [14], [15], [16]), the streams recorded could be the neural amplifier and experimental triggers; a movement tracker ([17], [18], [19]) or a force sensor ([20]) could additionally be added. For speech perception ([21], [22], [23]) or auditory perception ([24], [25]), the recorded streams would be the audio stream, experiment triggers, and neural data.…”
Section: T-rex Platform
confidence: 99%
“…This experiment is a simple text-based instruction for a grasping task (Figure 4A). The participant is prompted by text in a Python tkinter window to continuously open and close either their left or right hand, as used in [13]. The experiment requires neural data as the input device and generates a StreamOutlet to send markers that inform about the start and end of the experiment and of the trials.…”
Section: Use Cases
confidence: 99%
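The trial-and-marker flow described in the quotation above can be sketched without a GUI. The marker strings and the `grasp_trial_markers` helper are hypothetical; in the real experiment the prompts appear in a tkinter window and the markers are pushed through an LSL StreamOutlet rather than collected in a list.

```python
import itertools


def grasp_trial_markers(hand: str, n_trials: int) -> list:
    """Build the marker sequence for a continuous open/close grasping task.

    Hypothetical sketch: alternates 'open' and 'close' prompts for one hand
    and brackets the trials with start/end markers.
    """
    markers = ["experiment_start"]
    for trial, action in zip(range(n_trials), itertools.cycle(["open", "close"])):
        markers.append(f"trial_{trial}_{hand}_{action}")
    markers.append("experiment_end")
    return markers


# Example: four trials with the right hand.
sequence = grasp_trial_markers("right", 4)
print(sequence)
```

Emitting explicit start/end and per-trial markers is what lets the recorded neural data be segmented into labeled trials offline.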