2015 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
DOI: 10.1109/bibm.2015.7359788
Systematic analysis of machine learning algorithms on EEG data for brain state intelligence

Cited by 17 publications (8 citation statements). References 7 publications.
“…Previous studies [24][25][26] have shown that EEG spectral features such as the band power coefficients are correlated to the cognitive workload. Therefore, this representation was used to train our recurrent model.…”
Section: EEG Frequency Features (mentioning)
Confidence: 99%
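The band power representation referred to above is typically derived from each channel's power spectral density. As a minimal, hypothetical sketch (the citing paper does not publish its implementation), the snippet below uses SciPy's Welch estimator to extract per-channel power in the standard delta/theta/alpha/beta bands; the sampling rate, band edges, and window length are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

# Illustrative band edges in Hz; exact choices vary across the cited studies.
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power_features(eeg, fs=256.0):
    """Per-channel band power coefficients.

    eeg : array of shape (n_channels, n_samples)
    fs  : sampling rate in Hz (assumed value)
    Returns a flat feature vector of length n_channels * n_bands.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs), axis=-1)
    df = freqs[1] - freqs[0]
    feats = []
    for low, high in BANDS.values():
        mask = (freqs >= low) & (freqs < high)
        # Integrate the PSD over the band (rectangle rule) for every channel.
        feats.append(psd[:, mask].sum(axis=-1) * df)
    return np.concatenate(feats)

# Example: 4 channels, 10 s of synthetic data at 256 Hz.
x = np.random.randn(4, 2560)
print(band_power_features(x).shape)  # (16,)
```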
“…Before the popularity of deep learning, the primary approaches for feature extraction mainly included time-frequency features extracted by signal analysis methods, such as power spectral density [42], bandpower [43], independent components [44], and differential entropy [45]. The widely researched pattern recognition and machine learning methods include artificial neural networks [46, 47], naive Bayes [48], support vector machines (SVM) [49, 50], etc. With the extensive application and in-depth promotion of deep learning, an ever-increasing number of brain science and neuroscience research teams are exploiting its strength in designing algorithms to achieve intelligent understanding and analysis of brain activities via EEGs, leading them to propose end-to-end models that integrate feature extraction and classification/clustering.…”
Section: Related Work (mentioning)
Confidence: 99%
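Among the features listed in that excerpt, differential entropy is often approximated per frequency band under a Gaussian assumption, where it reduces to 0.5·ln(2πe·σ²) of the band-filtered signal. The sketch below is an assumed illustration of that formula rather than the cited authors' code; the filter design, band edges, and sampling rate are hypothetical choices.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def differential_entropy(eeg, fs=256.0, band=(8.0, 13.0), order=4):
    """Band-limited differential entropy per channel, assuming Gaussianity.

    eeg  : array of shape (n_channels, n_samples)
    band : (low, high) cutoff frequencies in Hz (illustrative values)
    """
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, eeg, axis=-1)
    var = np.var(filtered, axis=-1)
    # Differential entropy of a Gaussian signal: 0.5 * ln(2 * pi * e * sigma^2)
    return 0.5 * np.log(2 * np.pi * np.e * var)

x = np.random.randn(4, 2560)
print(differential_entropy(x))  # one value per channel
```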
“…The extracted features are classified using three standard supervised classifiers, namely, support vector machine (SVM) (Mitchell 1997; Webb 2003; Zhang et al 2016), naive Bayes (NB) (Mitchell 1997; Leung 2007; Bhaduri et al 2016; Chan et al 2015) and k-nearest neighbor (kNN) (Mitchell 1997; Page et al 2015), independent of each other. Classification is performed in a hierarchical one-vs-one (OVO) approach and majority voting (Paul et al 2006) is used to decide the final outcome.…”
Section: Classification (mentioning)
Confidence: 99%
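A hedged reconstruction of that pipeline in scikit-learn might wrap each of the three classifiers in a one-vs-one decomposition and combine their predictions by majority vote; all hyperparameters below are placeholders, not values reported by the cited authors.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

def ovo_majority_vote(X_train, y_train, X_test):
    """Train SVM, NB and kNN independently (each one-vs-one) and
    decide the final label by majority vote across the three models."""
    base_models = [SVC(kernel="rbf"), GaussianNB(), KNeighborsClassifier(n_neighbors=5)]
    preds = []
    for model in base_models:
        clf = OneVsOneClassifier(model).fit(X_train, y_train)
        preds.append(clf.predict(X_test))
    preds = np.vstack(preds)  # shape (n_classifiers, n_test)
    # Majority vote per test sample (labels assumed to be non-negative ints).
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

# Tiny synthetic example with three classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(90, 8))
y = np.repeat([0, 1, 2], 30)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
print(ovo_majority_vote(X_tr, y_tr, X_te))
```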