1996
DOI: 10.1007/3-540-61723-x_994
The effect of extensive use of the mutation operator on generalization in genetic programming using sparse data sets

Abstract: Ordinarily, Genetic Programming uses little or no mutation; crossover is the predominant operator. This study tests the effect of a very aggressive use of the mutation operator on the generalization performance of our Compiling Genetic Programming System ('CGPS'). We ran our tests on two benchmark classification problems on very sparse training sets. In all, we performed 240 complete runs of population 3000 for each of the problems, varying the mutation rate between 5% and 80%. We found that increasing th…

Cited by 54 publications (45 citation statements). References 4 publications.
“…In particular, in Francone et al (1996) they introduce a GP system called Compiling GP and compare its generalization ability with that of other machine-learning paradigms. Furthermore, in Banzhaf et al (1996) they show the positive effect of an extensive use of the mutation operator on generalization in GP using sparse data sets. (1) the selection of the best-of-run individuals using a three-data-sets methodology, and (2) the application of parsimony pressure to reduce the complexity of the solutions.…”
Section: Generalization in GP
confidence: 93%
“…These techniques basically use a fitness function to guide the search, e.g., to gather the fittest solutions over generations. They are effective because they reward individuals with high scores, but they do not favor diversity, and the search may converge to local optima [3], [4]. The idea of NS, introduced by Lehman and Stanley in 2008 [8], represents an alternative solution to this issue.…”
Section: B. Novelty Search
confidence: 99%
“…The issue of premature convergence to local optima has been a common problem in GAs. Many methods have been proposed to avoid this problem [3], [4]. However, all these alternatives use fitness-based selection to guide the search.…”
Section: Introduction
confidence: 99%
“…A survey of the main contributions on generalization in GP was conducted some years ago by Kushchu in [68]. In [34] the authors use what they call the "Compiling GP System" to compare its generalization ability with that of other ML paradigms, and in [8] they show the positive effect of an extensive use of the mutation operator on generalization in GP with sparse data sets. In [19], Da Costa and Landry have recently proposed a new GP model called Relaxed GP, showing its generalization ability.…”
Section: Generalization in GP
confidence: 99%