2009
DOI: 10.2174/138620709788488984

Controlling Feature Selection in Random Forests of Decision Trees Using a Genetic Algorithm: Classification of Class I MHC Peptides

Abstract: Feature selection is an important challenge in many classification problems, especially when the number of features greatly exceeds the number of examples available. We have developed a procedure, GenForest, which controls feature selection in random forests of decision trees by using a genetic algorithm. This approach was tested through our entry into the Comparative Evaluation of Prediction Algorithms 2006 (CoEPrA) competition (accessible online at: http://www.coepra.org). CoEPrA was a modeling competition org…
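The abstract describes wrapping a genetic algorithm around random forests so that the GA evolves which features the forest sees. As a rough illustration only (not the authors' GenForest code), the sketch below evolves binary feature masks with tournament selection, one-point crossover, bit-flip mutation, and elitism; a toy fitness function stands in for the random-forest cross-validation accuracy a real wrapper would compute.

```python
import random

def ga_feature_select(fitness, n_features, pop_size=20, generations=30,
                      crossover_rate=0.8, mutation_rate=0.02, seed=0):
    """Evolve binary feature masks; fitness(mask) -> higher is better."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]  # elitism: carry the two best masks forward
        while len(next_pop) < pop_size:
            # tournament selection of two parents
            p1 = max(rng.sample(scored, 3), key=fitness)
            p2 = max(rng.sample(scored, 3), key=fitness)
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, n_features)  # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # bit-flip mutation
            child = [b ^ 1 if rng.random() < mutation_rate else b
                     for b in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy stand-in fitness: features 0-2 are "informative"; a small penalty
# per selected feature rewards compact masks (a real wrapper would score
# each mask by classifier accuracy on the masked feature set instead).
def toy_fitness(mask):
    informative = {0, 1, 2}
    hits = sum(1 for i, b in enumerate(mask) if b and i in informative)
    return hits - 0.1 * sum(mask)

best = ga_feature_select(toy_fitness, n_features=10)
print(best)
```

In a real wrapper setup, `toy_fitness` would be replaced by a function that trains a random forest on the features flagged by the mask and returns its cross-validated accuracy; the GA machinery itself is unchanged.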

Cited by 14 publications (12 citation statements)
References 10 publications
“…BBO as an optimization algorithm in combination with a classifier works very well as a population optimizer as indicated by the results. Apart from the winner results, a source of comparison for BBO-SVM and BBO-RF are also from the methods suggested by Patil et al (2009), Hansen et al (2009), Kavuk et al (2010), Kavuk et al (2011). Patil et al (2009) used an ACO based feature selection approach with RF as a wrapper and reported very good results for all the CoEPrA classification tasks.…”
Section: Discussion
confidence: 99%
“…Later, Patil et al (2009) used Ant Colony Optimization as a wrapper with Random Forest for simultaneous feature selection and prediction of CoEPrA classification data. Hansen et al (2009) used a Genetic Algorithm for feature selection and constructed their training models using Random Forests. Kavuk et al (2010) employed a new type of loss function called weighted biased regression for the regression datasets.…”
Section: Related Work
confidence: 99%
“…al. focused on classification of peptides using random forests and genetic algorithms to conduct feature selection [14].…”
Section: Random Forests and Genetic Algorithms
confidence: 99%
“…This is because GA codes are portable and require relatively little interfacing between the GA part of code and the intended application [43]. GAs can also be coupled with other artificial intelligence approaches, including pattern recognition methods and decision trees [34,[43][44][45].…”
Section: Genetic Algorithms
confidence: 99%
“…Symmetry can sometimes be used to determine the nodal topology but this is obviously not possible in general [32]. Lester and co-workers have attempted to describe and classify nodal pockets using machine learning techniques such as random forests of decision trees [33,34]. More recently, several attempts to determine or improve the nodal hypersurface on the fly have been described including self-healing DMC [7,30,31,35].…”
Section: Introduction
confidence: 99%