2015
DOI: 10.1016/j.procs.2015.12.117

Wrapper Feature Subset Selection for Dimension Reduction Based on Ensemble Learning Algorithm

Cited by 102 publications (50 citation statements) · References 13 publications

“…The commonly used classification algorithms for identifying the most relevant input variables are: Naïve Bayes (Cortizo & Giraldez, 2006; Panthong & Srivihok, 2015), SVM (Maldonado & Weber, 2009; Maldonado, Weber & Famili, 2009), Random Forest (Rodin et al., 2009), Bagging (Panthong & Srivihok, 2015), AdaBoost (Panthong & Srivihok, 2015), and Extreme Learning Machines (Benoît, van Heeswijk, Miche, Verleysen & Lendasse, 2013). These classification techniques, combined with a greedy search algorithm, allow the optimal number of features to be found by iteratively selecting features based on classifier performance (Bengio et al., 2003).…”
Section: Methods (mentioning, confidence: 99%)
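
For context, the greedy wrapper search described in the quoted passage can be sketched with scikit-learn's SequentialFeatureSelector around a Naïve Bayes classifier. This is a minimal illustration under assumed defaults, not any cited author's implementation; the dataset and the 10-feature target are arbitrary choices of this sketch.

# Greedy forward selection wrapped around a Naive Bayes classifier
# (a sketch of the wrapper protocol described above).
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)

# At each step, greedily add the feature whose inclusion most improves
# the cross-validated accuracy of the classifier.
selector = SequentialFeatureSelector(
    GaussianNB(), n_features_to_select=10, direction="forward", cv=5
)
selector.fit(X, y)
print("Selected feature indices:", selector.get_support(indices=True))

Swapping GaussianNB() for any other estimator reproduces the same greedy protocol with SVM, Random Forest, Bagging, or AdaBoost.
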
“…At each iteration, the Random Forest (RF) algorithm, incorporating a hierarchical decision-tree structure, was used to explore candidate subsets of the features and measure their importance with respect to the classification outcome (Gregorutti, Michel & Saint-Pierre, 2017). To assess the robustness of the RF-RFE process in selecting an optimal subset of features, we applied the RFE technique to another type of ensemble method, namely bootstrap-aggregated (bagged) trees (BT), and compared the results (Panthong & Srivihok, 2015). Similarly, the BT-RFE performance was evaluated in 10-fold cross-validation repeated five times with different split positions.…”
Section: Methods (mentioning, confidence: 99%)
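
A hedged sketch of this RF-RFE/BT-RFE protocol, assuming scikit-learn's RFECV scored by 10-fold cross-validation repeated five times. Bagged trees expose no built-in aggregate feature ranking, so the sketch averages the importances of the individual trees; that averaging choice is an assumption of this illustration, not a detail taken from the quoted study.

# RFE ranked by Random Forest importance (RF-RFE), scored with
# 10-fold cross-validation repeated five times, then the same
# procedure on bagged trees (BT-RFE) for comparison.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import BaggingClassifier, RandomForestClassifier
from sklearn.feature_selection import RFECV
from sklearn.model_selection import RepeatedStratifiedKFold
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=5, random_state=0)

# RF-RFE: at each elimination step, drop the feature the forest
# ranks as least important.
rf_rfe = RFECV(RandomForestClassifier(random_state=0), cv=cv,
               scoring="accuracy").fit(X, y)

# BT-RFE: bagged trees carry no aggregate ranking, so average the
# per-tree importances (an assumption of this sketch).
def bagged_importance(fitted_bagging):
    return np.mean([tree.feature_importances_
                    for tree in fitted_bagging.estimators_], axis=0)

bt_rfe = RFECV(BaggingClassifier(DecisionTreeClassifier(), random_state=0),
               cv=cv, scoring="accuracy",
               importance_getter=bagged_importance).fit(X, y)

print("RF-RFE kept:", rf_rfe.n_features_, "| BT-RFE kept:", bt_rfe.n_features_)
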
“…Parkinsons: 16 features selected (indices 10, 21, 22, 17, 20, 18, 14, 9, 11, 12, 13, 5, 7, 4, 15, 8). The feature reduction achieved for the CKD, Breast Cancer Wisconsin, and Parkinsons datasets is 24%, 21.8%, and 30.4%, respectively. The feature subset derived by the proposed feature selection algorithm consists of the most discriminatory features, which enhances the accuracy of chronic-disease prediction without any loss of the original information. The reduced feature subset is evaluated against three classification algorithms: SVM, CNN, and Gradient Boosting.…”
Section: Experiments and Results (mentioning, confidence: 99%)

“…Experimental results reveal that the PDF-based feature selection method has achieved better accuracy than the existing PLS-based feature selection method for early identification of AD. Rattanawadee Panthong and Anongnart Srivihok [13] have presented a wrapper-based feature selection process for reducing the dimensionality of medical datasets. The methods used in this work include SFS and SBS, with the ensemble algorithms Bagging and AdaBoost used to evaluate the candidate subsets.…”
Section: Literature Review (mentioning, confidence: 99%)
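
The wrapper process summarized above (SFS and SBS with Bagging and AdaBoost evaluating each candidate subset) can be approximated as follows. This is a sketch using scikit-learn defaults, not the paper's exact procedure, and the 10-feature target is arbitrary.

# Wrapper selection in the spirit of the cited paper: sequential
# forward (SFS) and backward (SBS) search, with ensemble learners
# scoring every candidate subset.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.feature_selection import SequentialFeatureSelector

X, y = load_breast_cancer(return_X_y=True)

for name, learner in [("Bagging", BaggingClassifier(random_state=0)),
                      ("AdaBoost", AdaBoostClassifier(random_state=0))]:
    for direction in ("forward", "backward"):   # SFS and SBS
        sfs = SequentialFeatureSelector(
            learner, n_features_to_select=10, direction=direction, cv=5
        )
        sfs.fit(X, y)
        print(name, direction, sfs.get_support(indices=True))
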
“…Filter and wrapper methods of FS have been applied by many researchers to classification problems to select a candidate subset (reduced feature set) that increases the performance of classifiers [5][6]. In addition to these two methods, the embedded FS approach has also been applied to classification problems as part of the modeling process.…”
Section: Related Study (mentioning, confidence: 99%)
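
To make the distinction concrete, here is one illustrative scikit-learn representative for each FS family named in the quote; the specific estimators are this sketch's choices, not those of the works cited. A filter scores features independently of any model, a wrapper searches subsets using a classifier's performance, and an embedded method selects features as part of model fitting.

# One illustrative representative per FS family (sketch choices).
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import (SelectFromModel, SelectKBest,
                                       SequentialFeatureSelector, f_classif)
from sklearn.linear_model import LogisticRegression

X, y = load_breast_cancer(return_X_y=True)

# Filter: score each feature independently of any model (ANOVA F-test).
filter_fs = SelectKBest(f_classif, k=10).fit(X, y)

# Wrapper: search feature subsets by cross-validated classifier accuracy.
wrapper_fs = SequentialFeatureSelector(
    LogisticRegression(max_iter=5000), n_features_to_select=10
).fit(X, y)

# Embedded: selection happens inside model fitting (L1 sparsity).
embedded_fs = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", max_iter=5000)
).fit(X, y)

for name, fs in [("filter", filter_fs), ("wrapper", wrapper_fs),
                 ("embedded", embedded_fs)]:
    print(name, fs.get_support(indices=True))
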