2014 International Conference on Signal Processing and Communications (SPCOM)
DOI: 10.1109/spcom.2014.6983926
A bacterial foraging optimization and learning automata based feature selection for motor imagery EEG classification

Abstract: Selection of relevant features is an open problem in brain-computer interfacing (BCI) research. Features extracted from brain signals are often high dimensional, which in turn affects the accuracy of the classifier. Selecting the most relevant features improves the performance of the classifier and reduces the computational cost of the system. In this study, we have used a combination of Bacterial Foraging Optimization and Learning Automata to determine the best subset of features from a given motor imag…
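The full method sits behind the truncated abstract, but the wrapper idea it describes — scoring candidate feature subsets by how well a classifier performs on them while discouraging large subsets — can be sketched minimally. Everything below (the k-NN classifier, the penalty weight `alpha`, the synthetic data, and the single bit-flip search) is an illustrative assumption, not the authors' BFO/learning-automata algorithm.

```python
# Minimal wrapper-style feature-selection sketch (illustrative only).
# A binary mask marks selected features; fitness = cross-validated
# accuracy minus a small penalty on subset size. The random bit-flip
# step only loosely mimics a chemotaxis-like local move; it is NOT
# the paper's BFO + learning-automata algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=200, n_features=30,
                           n_informative=6, random_state=0)

def fitness(mask, alpha=0.01):
    if not mask.any():
        return -np.inf
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=5).mean()
    return acc - alpha * mask.sum()          # reward accuracy, penalise size

mask = rng.random(X.shape[1]) < 0.5          # random initial subset
best = fitness(mask)
for _ in range(100):                         # simple bit-flip search
    trial = mask.copy()
    trial[rng.integers(X.shape[1])] ^= True  # flip one feature in/out
    score = fitness(trial)
    if score > best:
        mask, best = trial, score

print(f"selected {mask.sum()} features, fitness {best:.3f}")
```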

Cited by 8 publications (5 citation statements) · References 23 publications
“…Researchers in [58] proposed the Chaos FA algorithm for heart disease prediction; a Bacterial Memetic Algorithm based feature selection for surface EMG-based hand motion recognition in long-term use was suggested by [85]. In [84], the authors tackled FS in Brain-Computer Interfacing (BCI) research with BFO combined with learning automata. The proposed algorithm selects a feature subset using the discrete wavelet transform, which is applied to decompose the EEG data.…”
mentioning
confidence: 99%
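One possible reading of the decomposition step described in this statement is a discrete wavelet transform applied per EEG channel, followed by simple sub-band statistics that form the candidate feature pool. The sketch below uses PyWavelets; the 'db4' wavelet, the 4 decomposition levels, and the chosen statistics are assumptions, not the cited paper's exact settings.

```python
# Sketch of DWT-based feature extraction for a single EEG channel,
# as a possible reading of the decomposition step described above.
# Wavelet choice, level count, and per-sub-band statistics are
# illustrative assumptions.
import numpy as np
import pywt

def dwt_features(signal, wavelet="db4", level=4):
    """Decompose one EEG channel and summarise each sub-band."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)  # [cA4, cD4..cD1]
    feats = []
    for band in coeffs:
        feats.extend([np.mean(band),
                      np.std(band),
                      np.sum(band ** 2)])                 # sub-band energy
    return np.array(feats)

# Example: 3 s of a synthetic 250 Hz motor-imagery-like trial.
fs = 250
t = np.arange(0, 3, 1 / fs)
trial = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(dwt_features(trial).shape)   # 3 stats x (level + 1) sub-bands = (15,)
```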
“…• Wrapper method: selects the subset of features based on the performance of the features on the learning algorithm during the evaluation step. Examples include using optimization techniques such as GA with the objective of maximizing cross-validation accuracy (Bhattacharyya et al, 2014; Pal et al, 2014; Xu et al, 2014; Ramos et al, 2016; Baig et al, 2017; Liu et al, 2017; Ramos and Vellasco, 2018; Ghosh et al, 2019), classification error (Wang and Veluvolu, 2017), unsupervised classification (Kimovski et al, 2015), similarity score and clustering validity index (Bhattacharyya et al, 2013; Rakshit et al, 2013), or classifier parsimony (Cîmpanu et al, 2017). • Embedded method: feature selection is incorporated as a part of the model's training process.…”
Section: Optimizing the Problem of Feature Selection
mentioning
confidence: 99%
“…• Wrapper method: selects the subset of features based on the performance of the features on the learning algorithm during the evaluation step. Examples include using optimization techniques such as GA with the objective of maximizing cross-validation accuracy (Bhattacharyya et al, 2014; Pal et al, 2014; Xu et al, 2014; Ramos et al, 2016; Ramos and Vellasco, 2018; Ghosh et al, 2019), classification error (Wang and Veluvolu, 2017), unsupervised classification (Kimovski et al, 2015), similarity score and clustering validity index (Bhattacharyya et al, 2013; Rakshit et al, 2013), or classifier parsimony (Cîmpanu et al, 2017). • Embedded method: feature selection is incorporated as a part of the model's training process.…”
Section: Optimizing the Problem of Feature Selection
mentioning
confidence: 99%
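A minimal sketch of the wrapper approach named in these statements: a genetic algorithm whose fitness is the cross-validated accuracy of the induced classifier. The population size, mutation rate, linear SVM, and synthetic data below are illustrative assumptions and do not reproduce any of the cited methods.

```python
# Toy genetic-algorithm wrapper: evolve binary feature masks whose
# fitness is plain cross-validated accuracy (no size penalty here).
# All hyperparameters and data are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=200, n_features=25,
                           n_informative=5, random_state=1)
n_feat, pop_size, n_gen = X.shape[1], 20, 15

def cv_acc(mask):
    if not mask.any():
        return 0.0
    return cross_val_score(LinearSVC(max_iter=5000), X[:, mask], y, cv=5).mean()

pop = rng.random((pop_size, n_feat)) < 0.5             # random initial masks
for _ in range(n_gen):
    fit = np.array([cv_acc(m) for m in pop])
    parents = pop[np.argsort(fit)[-pop_size // 2:]]    # keep the fitter half
    children = []
    while len(children) < pop_size - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, n_feat)                   # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child ^= rng.random(n_feat) < 0.05              # bit-flip mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([cv_acc(m) for m in pop])]
print(f"best subset keeps {best.sum()} of {n_feat} features")
```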