2016
DOI: 10.1186/s12984-016-0134-9

Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping

Abstract: Background: Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it ap…
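The truncated abstract describes shared control in which BMI-decoded commands are blended with vision-guided assistance as the hand nears an object. Below is a minimal Python sketch of one such blending rule; the function name, the linear assistance ramp, and the parameters (assist_radius, max_assist) are illustrative assumptions, not the paper's published algorithm.

```python
import numpy as np

def blend_commands(user_vel, auto_vel, hand_pos, target_pos,
                   assist_radius=0.15, max_assist=0.7):
    """Blend a BMI-decoded velocity with an autonomous vision-guided
    velocity. The autonomous weight ramps up as the hand nears the
    target; far from any target the user retains full control.
    All names and the linear blending rule are illustrative
    assumptions, not the paper's exact method.
    """
    dist = np.linalg.norm(np.asarray(target_pos) - np.asarray(hand_pos))
    # Assistance grows linearly from 0 (at assist_radius) to max_assist (at the target).
    alpha = max_assist * max(0.0, 1.0 - dist / assist_radius)
    return (1.0 - alpha) * np.asarray(user_vel) + alpha * np.asarray(auto_vel)

# Example: near the target, the autonomous correction contributes proportionally.
user_vel = np.array([0.10, 0.00, 0.00])   # decoded from neural activity (m/s)
auto_vel = np.array([0.06, 0.03, -0.01])  # computed from object pose via vision
cmd = blend_commands(user_vel, auto_vel,
                     hand_pos=[0.40, 0.10, 0.20],
                     target_pos=[0.45, 0.12, 0.20])
print(cmd)
```

A continuous blend of this kind keeps the user in command in open space while smoothing the final approach, which is where noisy neural decoding tends to hurt grasp success most.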

Cited by 86 publications (50 citation statements). References 23 publications.
“…Second, our findings provide additional clues with regard to the types of information available within human brain regions that could be exploited for the development of human neuromotor prosthetics that are sensitive to the wide variety of computations needed for dextrous hand actions (Aflalo et al, 2015;Andersen, Kellis, Klaes, & Aflalo, 2014;Collinger et al, 2013;Downey et al, 2016;Jarosiewicz et al, 2015).…”
Section: Discussion
confidence: 90%
“…Neurons were also found that were extremely specialized for specific behavioral actions; for example, we found units that became active for imagined movements of the hand to the mouth but would not become active for similar movements such as movement of the hand to the cheek or forehead. Such neural encoding of behaviorally meaningful actions opens the possibility that the high-level intent of the subject can be decoded and integrated with smart robotics (49–51) to perform complex movements that may otherwise require attentionally demanding moment-to-moment control of a robotic limb. To demonstrate this concept for PPC, we showed that participant EGS was able to grasp an object and move it to a new location with a robotic limb, which combined his timing of the intended movements with machine vision and smart robotic algorithms (50).…”
Section: First Implants of PPC in Humans
confidence: 99%
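The excerpt above describes decoding high-level movement intent and letting smart robotics handle the moment-to-moment trajectory. A minimal sketch of that division of labor follows, assuming a simple phase machine in which decoded intent only gates transitions; the phase names, thresholds, and gating rule are hypothetical, not the cited system's design.

```python
from enum import Enum, auto

class Phase(Enum):
    IDLE = auto()   # waiting for a decoded movement intent
    REACH = auto()  # vision-guided transport toward the object
    GRASP = auto()  # autonomous grasp once the hand is aligned

def step(phase, intent_prob, hand_error, intent_thresh=0.8, align_tol=0.02):
    """One control tick: decoded intent gates the transitions, while
    the moment-to-moment trajectory is handled autonomously."""
    if phase is Phase.IDLE and intent_prob > intent_thresh:
        return Phase.REACH
    if phase is Phase.REACH and hand_error < align_tol:
        return Phase.GRASP
    return phase

# Example: a confident decoded intent hands control to the robot.
phase = Phase.IDLE
phase = step(phase, intent_prob=0.93, hand_error=0.30)  # -> REACH
phase = step(phase, intent_prob=0.10, hand_error=0.01)  # -> GRASP
print(phase)
```

Gating on intent rather than decoding a full trajectory reduces the attentional load on the user, which is exactly the benefit the excerpt attributes to integrating decoded intent with smart robotics.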
“…In this distal upper limb rehabilitation robot, the camera is mounted on the exoskeleton. This is different from other types of image-guided robot control devices, in which most of the cameras in other robots are placed externally in a fixed coordinate system [27,33]. This concept comes from a snake-eye view [34], in contrast to the human-eye view.…”
Section: Image-processing Algorithm for Recognizing User Intent
confidence: 98%
“…In this study, we attempted to apply an image processing-based approach that can reflect user-intent. Visual compensation using camera images in the BMI control of the robotic arm has recently been introduced [27]. Bang et al [28] suggested an upper limb rehabilitation robot system for precision control by camera-based image processing.…”
Section: Introduction
confidence: 99%