2014 9th International Conference on Industrial and Information Systems (ICIIS)
DOI: 10.1109/iciinfs.2014.7036522
Genetic algorithm based wrapper feature selection on hybrid prediction model for analysis of high dimensional data

Cited by 25 publications (14 citation statements); references 8 publications.
“…Several classification techniques have been applied on the PID dataset such as ANNs, Bayesian-based approaches, fuzzy logic, decision trees, K-means, Support Vector Machines, random forests (RF), Genetic algorithms, and K-Nearest Neighbors, for producing the risk of developing T2DM (Anirudha, Kannan, & Patil, 2014;Belciug & Gorunescu, 2014;Bioch, Meer, & Potharst, 1996;Carpenter & Markuzon, 1998;Gürbüz & Kiliç, 2014;Ilango & Ramaraj, 2010;Kahramanli & Allahverdi, 2008;Michie, Spiegelhalter, & Taylor, 1994;Nanni, Fantozzi, & Lazzarini, 2015;Patil, Joshi, & Toshniwal, 2010;Perez, Yanez-Marquez, Camacho-Nieto, Lopez-Yanez, & Arguelles-Cruz, 2015;Purwar & Singh, 2015;Seera & Lim, 2014;Sutanto & Ghani, 2015;Yilmaz, Inan, & Uzer, 2014;Zhu, Xie, & Zheng, 2015). In order to provide evidence of advancing the current state of the art, the performance of the proposed models has been comparatively assessed with those obtained by applying logistic regression, Bayesian-based approaches, and decision trees.…”
Section: Introduction (mentioning; confidence: 99%)
“…The use of evolutionary strategies for the selection of features has been initially proposed in [19]. Since then, it has been regarded as a powerful tool for feature selection in machine learning [17] and proposed by numerous authors as a search strategy (see, e.g., [20], [21]). Multi-objective evolutionary algorithms are designed to solve a set of minimization/maximization problems for a tuple of n functions…”
Section: Introduction (mentioning; confidence: 99%)
“…The use of evolutionary strategies for the selection of features has been initially proposed in Siedlecki and Sklansky (). Since then, it has been regarded as a powerful tool for feature selection in machine learning (Vafaie & Jong, ) and proposed by numerous authors as a search strategy (see, e.g., ElAlami, ; Anirudha, Kannan, & Patil, ). Multi-objective evolutionary algorithms are designed to solve a set of minimization/maximization problems for a tuple of n functions f₁(x⃗), …, fₙ(x⃗), where x⃗ is a vector of parameters belonging to a given domain.…”
Section: Introduction (mentioning; confidence: 99%)
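The citation statements above describe genetic-algorithm wrapper feature selection: candidate feature subsets are encoded as binary masks and evolved by selection, crossover, and mutation against a fitness function. A minimal sketch of that loop follows; it is an illustration, not the paper's implementation — in a real wrapper the fitness would be a classifier's cross-validated accuracy, whereas here a toy fitness (the hypothetical `INFORMATIVE` set and the 0.1 sparsity penalty are assumptions) stands in so the example stays self-contained:

```python
import random

random.seed(0)

N_FEATURES = 8

# Hypothetical "informative" features for this toy problem; a real
# wrapper would instead train and score a classifier per candidate mask.
INFORMATIVE = {0, 2, 5}

def fitness(mask):
    """Toy wrapper fitness: reward informative features, penalize
    subset size (a crude scalarization of the two objectives)."""
    selected = {i for i, bit in enumerate(mask) if bit}
    return len(selected & INFORMATIVE) - 0.1 * len(selected)

def crossover(a, b):
    """Single-point crossover of two binary masks."""
    point = random.randrange(1, N_FEATURES)
    return a[:point] + b[point:]

def mutate(mask, rate=0.1):
    """Flip each bit independently with probability `rate`."""
    return [bit ^ int(random.random() < rate) for bit in mask]

def ga_select(pop_size=20, generations=30):
    """Evolve a population of feature masks; return the best mask found."""
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(elite),
                                     random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children                # elitism preserves the best
    return max(pop, key=fitness)

best = ga_select()
print(best)
```

Swapping the toy `fitness` for one that trains a classifier on the selected columns turns this sketch into the wrapper approach the excerpts refer to; the multi-objective variants they mention keep accuracy and subset size as separate objectives instead of scalarizing them.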