2015
DOI: 10.1016/j.knosys.2015.04.015

Joint model for feature selection and parameter optimization coupled with classifier ensemble in chemical mention recognition

Cited by 23 publications (15 citation statements). References 21 publications.
“…Recently, artificial intelligence (AI) methodologies have been used to improve available classification models. At the same time, the presence of many features in high-dimensional medical data leads to problems such as overfitting, high computational complexity, and low interpretability of the final model 7,8 . The simplest way to address these problems is to reduce the number of features using a feature selection (FS) approach.…”
Section: Introduction (mentioning; confidence: 99%)
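To make the FS idea in the statement above concrete, here is a minimal, hypothetical Python sketch (not from the cited paper) that shrinks a high-dimensional dataset by keeping only the features most informative about the class label; the synthetic data and the choice of mutual information as the score are illustrative assumptions.

# Minimal sketch of filter-style feature selection: keep the k features
# with the highest mutual information with the class label, shrinking a
# high-dimensional dataset before training a classifier.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Synthetic stand-in for high-dimensional medical data.
X, y = make_classification(n_samples=200, n_features=500,
                           n_informative=15, random_state=0)

selector = SelectKBest(score_func=mutual_info_classif, k=20)
X_reduced = selector.fit_transform(X, y)

print(X.shape, "->", X_reduced.shape)  # (200, 500) -> (200, 20)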
“…Another important difference is the preservation of the best solutions through elitist selection based on the fitness and spread of solutions. Ekbal and Saha (2015) applied NSGA-II to jointly optimize hyperparameters and features, and demonstrated the superiority of the resulting models over those built with all features and default hyperparameters. Binder et al. (2020) observed analogous results when optimizing an SVM, kkNN, and XGBoost.…”
Section: Metaheuristic-Based HPO Algorithms (mentioning; confidence: 99%)
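The joint optimization the statement describes can be sketched as a toy example; this is illustrative, not the setup of Ekbal and Saha (2015). Each candidate packs a boolean feature mask together with one hypothetical SVM hyperparameter (C) and is scored on two minimization objectives, cross-validated error and number of selected features, the kind of bi-objective fitness NSGA-II ranks by non-dominated sorting.

# Sketch of the joint (feature mask + hyperparameter) encoding behind
# NSGA-II-style optimization (illustrative only). Two objectives are
# minimized: cross-validated error and selected-feature count.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=150, n_features=30, random_state=0)

def evaluate(mask, C):
    """Return (CV error, number of selected features)."""
    if not mask.any():               # empty feature set: worst fitness
        return 1.0, X.shape[1]
    acc = cross_val_score(SVC(C=C), X[:, mask], y, cv=3).mean()
    return 1.0 - acc, int(mask.sum())

def dominates(a, b):
    """Pareto dominance: a is no worse in both objectives and differs."""
    return all(x <= y for x, y in zip(a, b)) and a != b

# Random population of joint (feature mask, hyperparameter) solutions.
pop = [(rng.random(30) < 0.5, 10 ** rng.uniform(-2, 2)) for _ in range(20)]
fits = [evaluate(m, C) for m, C in pop]

# Non-dominated front; full NSGA-II would additionally rank fronts and
# crowd-sort before applying crossover/mutation to mask and parameter.
front = [f for f in fits if not any(dominates(g, f) for g in fits)]
print(sorted(front))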
“…The algorithm proposed by Martinez-de Pison et al. (2017) combines HPO with feature selection. Unlike other algorithms (Ekbal & Saha, 2015; J. Guo et al., 2019; León, Ortega, & Ortiz, 2019), the proposed method considers feature selection combined with HPO.…”
Section: Hybrid HPO Algorithms (mentioning; confidence: 99%)
“…Therefore, wrapper methods usually achieve better learning performance than filter methods [33, 34]. Commonly used wrapper methods include genetic algorithms (GAs) [35] and particle swarm optimization (PSO) [36].…”
Section: Introduction (mentioning; confidence: 99%)
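As a rough illustration of the wrapper idea (generic, not the specific GA or PSO variants cited): a GA evolves bit-string feature subsets, and each subset's fitness is the classifier's own cross-validated accuracy, which is the defining contrast with filter scores computed independently of the learner. The classifier choice and GA settings below are arbitrary assumptions.

# Sketch of a GA-style wrapper for feature selection: bit-string
# individuals encode feature subsets; fitness is the classifier's own
# cross-validated accuracy (what distinguishes wrappers from filters).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=150, n_features=25, random_state=1)

def fitness(mask):
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()

pop = [rng.random(25) < 0.5 for _ in range(12)]
for _ in range(10):
    scored = sorted(pop, key=fitness, reverse=True)
    parents = scored[:6]                       # truncation selection
    children = []
    for _ in range(6):
        a, b = rng.choice(6, size=2, replace=False)
        cut = rng.integers(1, 24)              # one-point crossover
        child = np.concatenate([parents[a][:cut], parents[b][cut:]])
        child ^= rng.random(25) < 0.05         # bit-flip mutation
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best), "cv acc:", fitness(best))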