2010
DOI: 10.1162/evco.2010.18.2.18202
Enabling the Extended Compact Genetic Algorithm for Real-Parameter Optimization by Using Adaptive Discretization

Abstract: An adaptive discretization method, called split-on-demand (SoD), enables estimation of distribution algorithms (EDAs) for discrete variables to solve continuous optimization problems. SoD randomly splits a continuous interval if the number of search points within the interval exceeds a threshold, which is decreased at every iteration. After the split operation, the nonempty intervals are assigned integer codes, and the search points are discretized accordingly. As an example of using SoD with EDAs, the integra…
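The split-and-encode procedure described in the abstract can be sketched as follows. This is a minimal illustration of the stated idea only (split an interval at a random point when it holds more points than a threshold, then assign integer codes to the nonempty intervals); the function and variable names are illustrative and not taken from the paper, and the paper's threshold-decrease schedule is omitted.

```python
import random
import bisect

def split_on_demand(points, lo, hi, threshold):
    """Return sorted interval boundaries over [lo, hi] such that no
    interval contains more than `threshold` of the given points."""
    boundaries = [lo, hi]
    changed = True
    while changed:
        changed = False
        for i in range(len(boundaries) - 1):
            a, b = boundaries[i], boundaries[i + 1]
            count = sum(1 for p in points if a <= p < b)
            if count > threshold:
                # split the overcrowded interval at a random point
                bisect.insort(boundaries, random.uniform(a, b))
                changed = True
                break
    return boundaries

def encode(points, boundaries):
    """Assign each point the integer code of its nonempty interval,
    numbering only intervals that actually contain points."""
    occupied = sorted({bisect.bisect_right(boundaries, p) - 1 for p in points})
    code_of = {iv: c for c, iv in enumerate(occupied)}
    return [code_of[bisect.bisect_right(boundaries, p) - 1] for p in points]

random.seed(0)
pts = [random.uniform(0.0, 1.0) for _ in range(20)]
bounds = split_on_demand(pts, 0.0, 1.0, threshold=5)
codes = encode(pts, bounds)
```

After this step, an EDA built for discrete variables (such as ECGA) can model and sample the integer codes directly, which is the role SoD plays in the paper.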

Cited by 19 publications (22 citation statements)
References 22 publications
“…Finally, some regions of the search space are more densely covered with high quality solutions whereas others contain mostly solutions of low quality; this suggests that some regions require a more dense discretization than others. To deal with these difficulties, various approaches to adaptive discretization were developed using EDAs (Tsutsui, Pelikan, & Goldberg, 2001;Pelikan, Goldberg, & Tsutsui, 2003;Chen, Liu, & Chen, 2006;Suganthan, Hansen, Liang, Deb, Chen, Auger, & Tiwari, 2005;Chen & Chen, 2010). We discuss some of these next.…”
Section: Discretization
confidence: 99%
“…The resulting algorithm was shown to be successful on the two-peaks and deceptive functions. Another way to deal with discretization was proposed by Chen and Chen (2010). Their method uses the ECGA model and a split-on-demand (SoD) discretization to adjust on the fly how the real-valued variables are coded as discrete values.…”
Section: Discretization
confidence: 99%
“…To enhance the applicability of EDAs over continuous domains, direct attempts to modify the type of decision variables have been made, including continuous population-based incremental learning with Gaussian distribution [31], real-coded variant of population-based incremental learning with interval updating [32], Bayesian evolutionary algorithms for continuous function optimization [33], real-coded extended compact genetic algorithm based on mixtures of models [18], and the real-coded Bayesian optimization algorithm [1]. Instead of modifying the infrastructure of the algorithm, such as the type of decision variables or the global program flow, as a more general, component-wise approach, discretization methods are employed to cooperate with EDAs [6], [7], [28], [35]. Discretization methods enable EDAs designed for discrete variables to solve continuous optimization problems without the need of altering algorithmic structures.…”
Section: Introduction
confidence: 99%
“…Existing discretization algorithms, such as the fixed-height histogram [6] (FHH) and the split-on-demand [7]- [9] (SoD), use only the densities of selected chromosomes to discretize the continuous region.…”
Section: Introduction
confidence: 99%
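The fixed-height histogram (FHH) mentioned above places bin boundaries so that each bin holds roughly the same number of selected points, so densely populated regions receive narrower bins. A minimal equal-frequency sketch of that idea, with illustrative names not taken from the cited paper:

```python
def fhh_boundaries(points, n_bins):
    """Equal-frequency bin boundaries: each of the n_bins bins covers
    roughly the same number of points, so dense regions get narrow bins."""
    srt = sorted(points)
    n = len(srt)
    # interior boundaries at the quantile cut positions
    cuts = [srt[(i * n) // n_bins] for i in range(1, n_bins)]
    return [srt[0]] + cuts + [srt[-1]]

# points clustered near 0.1 produce a narrow first bin there
pts = [0.1, 0.11, 0.12, 0.13, 0.5, 0.9, 0.91, 0.95]
print(fhh_boundaries(pts, 4))  # → [0.1, 0.12, 0.5, 0.91, 0.95]
```

This contrasts with fixed-width binning, which would waste resolution on the sparse middle of the range; the evolutionary variants (ev-FHH, dev-FHH, ev-SoD, dev-SoD) discussed in the citing paper refine this density-driven placement further.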
“…In the last section, we focus on comparing these newly formed discretization algorithms with other discretization algorithms, their original versions, FHH and SoD. The discretization algorithms are integrated with ECGA to be tested on the 25 benchmark problems [11] used in the SoD paper [7]. Results show that ev-FHH and dev-FHH outperform FHH, and ev-SoD and dev-SoD outperform SoD on about 20 out of 25 benchmark problems [11] with 95% confidence.…”
Section: Introduction
confidence: 99%