2022
DOI: 10.3390/s22135000

Cross-Platform Implementation of an SSVEP-Based BCI for the Control of a 6-DOF Robotic Arm

Abstract: Robotics has been successfully applied in the design of collaborative robots for assistance to people with motor disabilities. However, man-machine interaction is difficult for those who suffer severe motor disabilities. The aim of this study was to test the feasibility of a low-cost robotic arm control system with an EEG-based brain-computer interface (BCI). The BCI system relies on the Steady State Visually Evoked Potentials (SSVEP) paradigm. A cross-platform application was obtained in C++. This C++ platfor…
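The SSVEP paradigm mentioned in the abstract identifies which flickering stimulus a user is attending to by finding the dominant frequency in the occipital EEG. A minimal sketch of that idea follows; the sampling rate, epoch length, candidate frequencies, and function name are illustrative assumptions, not values from the paper, and the paper's actual pipeline (implemented in C++) is not reproduced here.

```python
import numpy as np

def classify_ssvep(eeg_epoch, fs, stimulus_freqs):
    """Return the candidate stimulus frequency with the highest spectral power.

    eeg_epoch: 1-D array of EEG samples from an occipital channel.
    fs: sampling rate in Hz.
    stimulus_freqs: candidate flicker frequencies (Hz).
    """
    n = len(eeg_epoch)
    spectrum = np.abs(np.fft.rfft(eeg_epoch)) ** 2      # power spectrum
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)              # bin frequencies
    powers = []
    for f in stimulus_freqs:
        # sum power in a narrow band around each candidate frequency
        band = (freqs > f - 0.25) & (freqs < f + 0.25)
        powers.append(spectrum[band].sum())
    return stimulus_freqs[int(np.argmax(powers))]

# Synthetic check: a 10 Hz sinusoid plus noise should be classified as 10 Hz.
fs = 250
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
epoch = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
print(classify_ssvep(epoch, fs, [8.0, 10.0, 12.0, 15.0]))  # → 10.0
```

Real SSVEP decoders typically use multi-channel methods such as canonical correlation analysis rather than a single-channel power comparison; this sketch only illustrates the frequency-tagging principle.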

Cited by 15 publications (10 citation statements)
References 128 publications
“…In these studies, the interaction operations mapped residual limb movement to robot instructions, such as forward, back, left, right, rotation, or other Cartesian motions of the robotic arm, and to some preset simple household tasks, which could help users perform structured tasks but still required frequent limb movement. In addition, using biological signals to remotely control robots to perform these tasks would substantially reduce limb movements and physical burden [10][11][12][13]. However, time-consuming signal recognition, complex operation, and high cost limit their development.…”
Section: Introduction
confidence: 99%
“…A brain–computer interface is a communication system between a human and a computer [1] that allows one to send information or commands from the human brain to the outside world without the need for peripheral neural and muscular activity. Brain–computer interfaces (BCIs) help users express thoughts and control external devices by monitoring their brain activity [2,3]. A variety of methods for monitoring brain activity can be used to acquire brain signals in BCI systems, including electroencephalography (EEG), functional near-infrared spectroscopy [4,5,6], functional magnetic resonance imaging [7], and magnetoencephalography [8].…”
Section: Introduction
confidence: 99%
“…For instance, workers doing physical tasks with both hands in the industrial workspace would need the capability of hands-free teleoperation to conduct additional tasks simultaneously (Liu et al, 2021;Škulj et al, 2021). In addition to industrial purposes, it can be used to assist patients suffering from motor disabilities in an upper limb due to stroke or spinal cord injury (Meng et al, 2016;Chen et al, 2018;Quiles et al, 2022;Zhou et al, 2023). As grasping is an essential task to perform activities of daily living (ADL) (Roy et al, 2017), teleoperation of robotic grasping without upper limb motion can help these patients in fulfilling basic skills in everyday life.…”
Section: Introduction
confidence: 99%
“…Several previous studies utilized the SSVEP-BCI to achieve EEG-controlled robotic arm systems and their grasping capabilities, as illustrated in Table 1. The subject could move the end-effector not only within a two-dimensional (2D) plane (Cao et al, 2021) but also in three-dimensional (3D) space (Chen et al, 2018;Zhu et al, 2020;Peng et al, 2022;Quiles et al, 2022;Zhou et al, 2023) by using six or more flickering blocks. The target object for grasping also could be selected by linking each flickering block to a placed object (Yang et al, 2017;Chen et al, 2019;Li and Kesavadas, 2022).…”
Section: Introduction
confidence: 99%
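The flickering-block scheme described above binds each stimulus frequency to one end-effector command, so a decoded frequency becomes a Cartesian jog. A minimal sketch of that lookup follows; the six frequencies, the step size, and the function names are assumptions for illustration, not values from the cited studies.

```python
# Illustrative mapping of six flicker frequencies (Hz) to Cartesian jogs
# of the robotic arm's end-effector. STEP is an assumed per-command step.
STEP = 0.01  # metres

COMMANDS = {
    8.0:  ( STEP, 0.0, 0.0),   # +x
    9.0:  (-STEP, 0.0, 0.0),   # -x
    10.0: (0.0,  STEP, 0.0),   # +y
    11.0: (0.0, -STEP, 0.0),   # -y
    12.0: (0.0, 0.0,  STEP),   # +z
    13.0: (0.0, 0.0, -STEP),   # -z
}

def apply_command(position, detected_freq):
    """Advance the end-effector position by the jog bound to detected_freq."""
    dx, dy, dz = COMMANDS[detected_freq]
    x, y, z = position
    return (x + dx, y + dy, z + dz)

print(apply_command((0.0, 0.0, 0.0), 10.0))  # → (0.0, 0.01, 0.0)
```

Systems that also select grasp targets extend the same idea by binding additional flickering blocks to object identifiers instead of motion increments.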