2018
DOI: 10.3389/fnins.2018.00943
An Approach for Brain-Controlled Prostheses Based on a Facial Expression Paradigm

Abstract: One of the most exciting areas of rehabilitation research is brain-controlled prostheses, which translate electroencephalography (EEG) signals into control commands that operate prostheses. However, the existing brain-control methods have an obstacle between the selection of brain computer interface (BCI) and its performance. In this paper, a novel BCI system based on a facial expression paradigm is proposed to control prostheses that uses the characteristics of theta and alpha rhythms of the prefrontal and mo…

Cited by 19 publications (30 citation statements)
References 77 publications (93 reference statements)
“…As for the location of the EEG generator, since precise positioning cannot be achieved without extra professional equipment, the dipole was set in accordance with the known mechanism. In our serial research on ME-BCI (Zhang et al., 2016, 2021b; Li et al., 2018b; Lu et al., 2018a,b, 2020), data-driven brain connectivity analysis demonstrated the main involvement of the motor cortex (Lu et al., 2018a; Zhang et al., 2021b), consistent with contralateral motor control. Meanwhile, evidence showed that the frontal lobe and limbic system also participate in facial-expression processing (Price and Drevets, 2010; Li et al., 2018b; Lu et al., 2018b).…”
Section: The Head Models and the Dipole Position (supporting)
confidence: 63%
“…Research on the relationship between emotional processing and facial actions has increased. Previous studies demonstrated that brain activity in the prefrontal and motor cortices provides a biological basis for distinguishing the movements of facial actions [11, 14, 24, 29, 32, 33]. For an intended facial muscle contraction, many factors contribute to the mechanism of a person’s facial action.…”
Section: Methods (mentioning)
confidence: 99%
“…The mechanism of EMG-based facial action and its control model was analyzed from the brain responses of the cortices involved in facial muscle contractions. Our previous study demonstrated that the prefrontal, motor, and limbic cortices play a fundamental role in completing different facial actions [24]. Hence, the bio-signals from different facial muscles contain abundant and sophisticated information during their actions.…”
Section: Mechanism of EMG-Based Facial Action (mentioning)
confidence: 99%
“…For example, when using only EEG it is difficult to achieve satisfactory accuracy, and when using only EMG it is hard to guarantee the stability of recognition. Rui et al. classified facial actions to control a prosthesis, and the results showed that EMG-based control outperforms EEG-based control (Rui et al., 2018b; Xiaodong et al., 2020). Therefore, methods fusing EEG and EMG signals have emerged, as they can improve decoding performance and stability (Tejedor et al., 2019).…”
Section: Introduction (mentioning)
confidence: 99%
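The last statement describes feature-level EEG–EMG fusion: features from each modality are normalized, concatenated, and fed to a single classifier. The sketch below is a minimal, hypothetical illustration of that idea, not the cited authors' pipeline — the data are synthetic, the feature dimensions are invented, and a simple nearest-centroid rule stands in for whatever classifier a real system would use.

```python
import numpy as np

# Hypothetical feature-level EEG+EMG fusion for a two-class
# facial-action task. All data below are synthetic: two classes of
# trials whose EEG and EMG feature means differ by a fixed shift.
rng = np.random.default_rng(0)

n_per_class, n_eeg, n_emg = 40, 8, 4
eeg = np.vstack([rng.normal(0.0, 1, (n_per_class, n_eeg)),
                 rng.normal(0.8, 1, (n_per_class, n_eeg))])
emg = np.vstack([rng.normal(0.0, 1, (n_per_class, n_emg)),
                 rng.normal(1.5, 1, (n_per_class, n_emg))])
y = np.repeat([0, 1], n_per_class)

def zscore(x):
    # Normalize each feature so neither modality dominates the fusion.
    return (x - x.mean(0)) / x.std(0)

# Feature-level fusion: concatenate normalized EEG and EMG features.
fused = np.hstack([zscore(eeg), zscore(emg)])

# Minimal classifier: assign each trial to the nearest class centroid.
centroids = np.stack([fused[y == c].mean(0) for c in (0, 1)])
dists = ((fused[:, None, :] - centroids) ** 2).sum(-1)
pred = dists.argmin(axis=1)
acc_fused = (pred == y).mean()
print(f"fused accuracy: {acc_fused:.2f}")
```

In a real system the synthetic arrays would be replaced by band-power features from EEG and amplitude/envelope features from EMG, and the nearest-centroid rule by a trained classifier; the point here is only the normalize-then-concatenate fusion step.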