IEEE Congress on Evolutionary Computation 2010
DOI: 10.1109/cec.2010.5585988

Improving GP classification performance by injection of decision trees

Abstract: This paper presents a novel hybrid method combining genetic programming and decision tree learning. The method starts by estimating a benchmark level of reasonable accuracy, based on decision tree performance on bootstrap samples of the training set. Next, a normal GP evolution is started with the aim of producing an accurate GP. At even intervals, the best GP in the population is evaluated against the accuracy benchmark. If the GP has higher accuracy than the benchmark, the evolution continues normal…
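The procedure described in the abstract could be sketched as follows. This is a hypothetical Python illustration, not the authors' implementation: `train_tree` is a stand-in for any decision-tree learner (e.g. J48), and because the abstract is truncated, the injection rule in `needs_injection` is an assumption inferred from the paper's title.

```python
import random

def bootstrap_benchmark(X, y, train_tree, n_rounds=10, seed=0):
    """Estimate a 'reasonable accuracy' benchmark: train decision trees
    on bootstrap resamples of the training set and average their
    accuracies on the full training set."""
    rng = random.Random(seed)
    n = len(X)
    accs = []
    for _ in range(n_rounds):
        idx = [rng.randrange(n) for _ in range(n)]  # bootstrap resample
        tree = train_tree([X[i] for i in idx], [y[i] for i in idx])
        accs.append(sum(tree(x) == t for x, t in zip(X, y)) / n)
    return sum(accs) / len(accs)

def needs_injection(best_gp_accuracy, benchmark):
    """At even intervals, the best GP is compared against the benchmark.
    If it already beats the benchmark, evolution continues normally;
    otherwise (assumed, from the title) decision trees are injected
    into the GP population."""
    return best_gp_accuracy <= benchmark
```

For example, with a stub learner `lambda Xs, ys: (lambda x: x[0])` that predicts the first feature, the benchmark on `X = [[0], [1], [0], [1]]`, `y = [0, 1, 0, 1]` is 1.0 regardless of the bootstrap sample.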

Cited by 7 publications (4 citation statements)
References 24 publications (25 reference statements)
“…During the evaluation the framework achieved a high degree of prediction accuracy and a high degree of sensitivity. In order to obtain complete details about the characteristics of the framework, our achieved results were compared with the results achieved by the following hybrid models: DTiGP - Functional Tree (FT) classifier [19], C4.5+NB - Decision Tree + Naive Bayes classifier [20], GDADT - Genetic based Data Adaptation (GDA) + Decision Tree (DT) [21], BFT - Best First Tree [22], REP - Regression Tree + Information Gain [22], GA+C4.5 - Genetic Algorithm (GA) + Decision Tree (RGDT)…”
Section: B. Evaluation of Prediction Accuracy
confidence: 99%
“…J48 is a classifier based on tree structure representation, where each node represents a test of individual features and each level represents a class. The input dataset is partitioned by the tree based on the information gained to select the attribute, and the output is the hierarchical structure of the input [11,12,28]. The Naive Bayes (NB) classifier uses the Bayes' rule to compute the posterior probability of each class.…”
Section: Classification
confidence: 99%
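The information-gain attribute selection that the excerpt attributes to J48 can be illustrated generically (a minimal sketch, not Weka's J48 code; note that J48/C4.5 actually splits on the gain *ratio*, a normalized variant of the quantity below):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(labels, feature_values):
    """Reduction in entropy from partitioning `labels` by the values of
    one attribute -- the criterion C4.5-style trees use to choose which
    attribute to split on at each node."""
    n = len(labels)
    groups = {}
    for v, label in zip(feature_values, labels):
        groups.setdefault(v, []).append(label)
    conditional = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - conditional
```

An attribute that perfectly separates the classes (e.g. `info_gain([0, 0, 1, 1], [0, 0, 1, 1])`) yields a gain of 1 bit, while an uninformative one yields 0.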
“…In this variant, the population of genetic programs is enriched with decision trees induced using a standard technique, here J48. For more details see [19]. G-REX, just like J48, used Laplace corrections.…”
Section: Methods
confidence: 99%
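The Laplace correction that the excerpt mentions for both G-REX and J48 smooths the class probabilities reported at tree leaves. A minimal generic sketch (not G-REX's or Weka's code):

```python
def laplace_probability(class_count, leaf_total, n_classes):
    """Laplace-corrected estimate of P(class | leaf): one pseudo-count
    is added per class, so a leaf trained on few examples never reports
    a probability of exactly 0 or 1."""
    return (class_count + 1) / (leaf_total + n_classes)
```

For a binary problem, a pure leaf holding 3 examples of one class gets probability (3 + 1) / (3 + 2) = 0.8 for that class instead of 1.0, tempering overconfident leaves.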