2021
DOI: 10.3390/app11156983

An Ensemble Feature Selection Approach to Identify Relevant Features from EEG Signals

Abstract: Identifying relevant data to support the automatic analysis of electroencephalograms (EEG) has become a challenge. Although there are many proposals to support the diagnosis of neurological pathologies, the current challenge is to improve the reliability of the tools to classify or detect abnormalities. In this study, we used an ensemble feature selection approach to integrate the advantages of several feature selection algorithms to improve the identification of the characteristics with high power of differen…
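The abstract does not spell out how the individual selectors are combined. As a rough illustration only, the sketch below shows one common ensemble scheme, aggregating per-feature rankings produced by several selectors; the specific selectors, the aggregation rule, and the synthetic data are assumptions for demonstration, not details taken from the paper.

```python
# Minimal sketch of rank-aggregation ensemble feature selection.
# Illustrative only: the selectors, aggregation rule, and data are
# assumptions, not the method used in the cited paper.
import numpy as np
from scipy.stats import rankdata
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif, mutual_info_classif

X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

# Score every feature with three different selectors (higher = more relevant).
scores = [
    f_classif(X, y)[0],                               # ANOVA F-test (filter)
    mutual_info_classif(X, y, random_state=0),        # mutual information (filter)
    RandomForestClassifier(random_state=0).fit(X, y).feature_importances_,  # embedded
]

# Convert each score vector to ranks and average them; a low mean rank
# means the feature was considered relevant by most selectors.
mean_rank = np.mean([rankdata(-s) for s in scores], axis=0)
top_k = np.argsort(mean_rank)[:10]   # indices of the 10 best-ranked features
print("Selected feature indices:", top_k)
```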

Cited by 13 publications (4 citation statements)
References 31 publications
“…These methods aim to improve model performance by reducing dimensionality, improving interpretability, and mitigating overfitting. Common approaches include filter methods [46], which evaluate features independently of the learning algorithm; wrapper (encapsulation) methods [47], which use the performance of the learning algorithm as the feature selection criterion; and embedded methods [48], where feature selection is integrated into the model-building process itself. Each method offers distinct advantages and trade-offs, depending on factors such as dataset size, dimensionality, and computing resources.…”
Section: Feature Selection
Mentioning (confidence: 99%)
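To make the three families named in the excerpt above concrete, here is a minimal, assumed sketch contrasting a filter, a wrapper, and an embedded selector on the same synthetic data with scikit-learn; the estimators and parameters are illustrative choices, not those of the cited works.

```python
# Sketch contrasting filter, wrapper, and embedded feature selection.
# The estimators, data, and feature counts are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE, SelectFromModel, SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

# Filter: scores each feature independently of any learning algorithm.
filter_sel = SelectKBest(score_func=f_classif, k=10).fit(X, y)

# Wrapper: recursively drops features based on a fitted model's coefficients.
wrapper_sel = RFE(LogisticRegression(max_iter=1000),
                  n_features_to_select=10).fit(X, y)

# Embedded: selection happens inside model training; keeps features whose
# importance exceeds the mean importance of the fitted forest.
embedded_sel = SelectFromModel(RandomForestClassifier(random_state=0)).fit(X, y)

for name, sel in [("filter", filter_sel), ("wrapper", wrapper_sel),
                  ("embedded", embedded_sel)]:
    print(name, sel.get_support(indices=True))
```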
“…This approach includes techniques such as decision trees [31], random forests [32], and multinomial logistic regression [33], which inherently evaluate the importance of features during model construction. For example, Maritza et al. [34] adopted an embedded feature selection method to improve the identification of relevant features in EEG signals and to enhance the classifier's performance in detecting anomalies, achieving high accuracy and stability. Albaqami et al. [35] proposed a feature dimensionality reduction algorithm and adopted a gradient boosting decision tree for feature extraction and classification.…”
Section: Related Work
Mentioning (confidence: 99%)
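As a generic illustration of the embedded idea described in this excerpt, the following assumed sketch fits a gradient boosting classifier and keeps the features with the largest importances; it is not the method of Maritza et al. [34] or Albaqami et al. [35].

```python
# Generic sketch of embedded selection via gradient boosting importances.
# Not the algorithm of [34] or [35]; the data and top-k cut-off are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

gbdt = GradientBoostingClassifier(random_state=0).fit(X, y)

# Features that contribute most across the boosted trees receive higher
# importances; keep the 10 strongest and retrain on the reduced set.
top_k = np.argsort(gbdt.feature_importances_)[::-1][:10]
gbdt_reduced = GradientBoostingClassifier(random_state=0).fit(X[:, top_k], y)
print("Kept feature indices:", top_k)
```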
“…To perform this work and other related studies [5,18,29–31], 100 pediatric EEGs were collected from the same number of children, aged between 22 days and 17 years, suspected of suffering from epilepsy. The exams were performed with the patients asleep, on the recommendation of the neurologist.…”
Section: Data Collection: Acquisition of Encephalograms
Mentioning (confidence: 99%)