2014
DOI: 10.3233/bme-141111
Hybrid Brain-Computer Interface (BCI) based on the EEG and EOG signals

Abstract: Recently, the integration of other electrophysiological signals with the electroencephalogram (EEG) has become an effective approach to improving the practicality of brain-computer interface (BCI) systems, referred to as hybrid BCIs. In this paper, a hybrid BCI was designed by combining EEG with electrooculogram (EOG) signals and tested using a target selection experiment. Gaze direction from the EOG and the event-related (de)synchronization (ERD/ERS) induced by motor imagery in the EEG were simultaneously …

Cited by 25 publications (18 citation statements)
References 20 publications
“…EOG and EMG have also been combined with EEG to improve mental-task classification using Fisher discriminant analysis (Zhang et al., 2010). Another study, by Jiang et al. (2014), showed that features selected from different eye-movement and gaze signals led to 89.3% accuracy with an LDA classifier. Real-time video game control by Belkacem et al. (2015a) and exoskeleton control by Witkowski et al. (2014) have also been implemented as hBCIs using a thresholding scheme, with accuracies of 77.3% (for six commands) and 63.59% (for four commands), respectively.…”
Section: Hardware Combination
confidence: 99%
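The feature-level approach described above (EEG motor-imagery features combined with EOG gaze features, classified by LDA) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature values are simulated, and the dimensions (8 EEG band-power features, 2 EOG gaze features) are assumptions chosen for the example.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical per-trial features: EEG band power (e.g. mu/beta over 8
# channels) and EOG gaze direction (horizontal/vertical). Simulated here.
n_trials = 200
eeg_features = rng.normal(size=(n_trials, 8))
eog_features = rng.normal(size=(n_trials, 2))
labels = rng.integers(0, 2, size=n_trials)

# Shift class-1 trials so the two classes are separable in both modalities.
eeg_features[labels == 1] += 1.0
eog_features[labels == 1] += 1.0

# Feature-level fusion: concatenate both modalities, train a single LDA model.
fused = np.hstack([eeg_features, eog_features])
clf = LinearDiscriminantAnalysis().fit(fused, labels)
accuracy = clf.score(fused, labels)
```

In practice the EEG features would come from ERD/ERS band-power estimation and the EOG features from gaze-direction decoding, with accuracy evaluated by cross-validation rather than on the training set.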
“…We discuss EOG- and eye-tracker-based studies together, as both use eye movements for classification. For command generation, the signals are decoded simultaneously, and for control of a BCI system they are fused using a combined classifier (Jiang et al., 2014). Besides the removal of ocular artifacts from EEG data (Li et al., 2015), drowsiness detection (Khushaba et al., 2011) and wheelchair control (Ramli et al., 2015) are among the most common applications of EEG–EOG-based systems.…”
Section: Hardware Combination
confidence: 99%
“…In this case, multiple input signals from different systems can be fed into one classification algorithm, or the individual decisions can be fused into one final decision. Yin et al. [113] utilized SSVEP and the visual P300 simultaneously to increase the classification accuracy and ITR of a BCI speller, while Jiang et al. [131] fused MI features from EEG signals with gaze directions from EOG signals to improve BCI performance for multi-class target selection. It is also possible for one system to act as a switch that initiates the other system by detecting a distinct signal.…”
Section: Study 1: Taxonomy of Hybrid BCIs
confidence: 99%
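The decision-level alternative mentioned above (fusing the outputs of separate classifiers rather than their features) can be illustrated with a weighted average of per-class probabilities. The weights and probability values here are illustrative assumptions, not values from any of the cited studies.

```python
import numpy as np

def fuse_decisions(p_eeg, p_eog, w_eeg=0.6, w_eog=0.4):
    """Decision-level fusion: weighted average of the per-class probability
    outputs of two independent classifiers. Weights are illustrative and
    would normally be tuned on validation data."""
    fused = w_eeg * np.asarray(p_eeg) + w_eog * np.asarray(p_eog)
    return int(np.argmax(fused)), fused

# Example: the EEG classifier is nearly undecided, while the EOG classifier
# strongly prefers class 1; the fused decision follows the confident modality.
choice, fused = fuse_decisions([0.55, 0.45], [0.10, 0.90])
```

Because the weights sum to one and each input is a probability distribution, the fused vector is itself a valid distribution over classes.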
“…To overcome the limitations of a single MI paradigm, many studies in recent years have realized multidimensional control of external devices by combining MI with other EEG modalities (Rebsamen et al., 2010; Long et al., 2012; Li et al., 2013; Bhattacharyya et al., 2014; Ma et al., 2017) or with other bioelectrical signals (Punsawad et al., 2010; Jun et al., 2014; Witkowski et al., 2014; Ma et al., 2015; Soekadar et al., 2015; Minati et al., 2016), i.e., hybrid brain-computer interfaces (hBCIs) (Pfurtscheller et al., 2010; Hong and Khan, 2017). For example, in Long et al. (2012) and Li et al. (2013) the user continuously controlled the direction (left/right turn) of a wheelchair using left- or right-hand imagery, and used the P300 potential and SSVEP to generate discrete commands such as acceleration/deceleration and stopping; in Ma et al. (2017), the users generated MI to control the movement of a robotic arm and stopped it upon detection of the P300 potential.…”
Section: Introduction
confidence: 99%
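The control pattern in these examples, continuous steering from MI output plus a discrete stop command from an evoked potential such as the P300, can be sketched as a small state machine. This is a hypothetical simplification for illustration; the device, state fields, and command names are assumptions, not the cited systems' interfaces.

```python
def hybrid_step(state, mi_command, p300_detected):
    """One control step of a hypothetical hybrid BCI: the MI decoder output
    ("left"/"right"/None) steers continuously, while a detected P300 issues
    a discrete stop that freezes the device."""
    if p300_detected:
        return {"x": state["x"], "moving": False}
    if not state["moving"]:
        return state  # once stopped, MI commands are ignored
    dx = {"left": -1, "right": +1}.get(mi_command, 0)
    return {"x": state["x"] + dx, "moving": True}

# Simulated session: steer right twice, left once, then a P300 stop.
state = {"x": 0, "moving": True}
for mi, p300 in [("right", False), ("right", False), ("left", False), (None, True)]:
    state = hybrid_step(state, mi, p300)
```

Real systems add debouncing and confidence thresholds before acting on either signal, but the separation of roles (continuous MI control, discrete evoked-potential commands) is the same.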