2010
DOI: 10.1007/s00500-010-0636-5

Three modified versions of differential evolution algorithm for continuous optimization

Abstract: Differential evolution (DE) is a simple and effective evolutionary algorithm (EA) for global optimization. In this paper, three modified versions of DE are proposed to improve its performance, to repair its defect in converging accurately to the individual optimal point, and to compensate for the limited number of search moves of the original DE. In the first modified version, called bidirectional differential evolution (BDE), the bidirectional optimization concept is used to generate a new trial point, and in the…
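For context, the baseline that all three modified versions build on is the classic DE/rand/1/bin scheme. Below is a minimal, self-contained sketch of that baseline in Python; the function name, parameter values (F, CR, population size) and the sphere test function are illustrative assumptions only, not the paper's BDE, SDE, or third variant.

```python
import numpy as np

def de_rand_1_bin(objective, bounds, pop_size=30, F=0.5, CR=0.9, max_gens=200, seed=0):
    """Minimal DE/rand/1/bin loop; the paper's variants modify this baseline."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([objective(x) for x in pop])

    for _ in range(max_gens):
        for i in range(pop_size):
            # Pick three distinct individuals different from the current one
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])      # DE/rand/1 mutation
            mutant = np.clip(mutant, lo, hi)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True                 # at least one gene from the mutant
            trial = np.where(cross, mutant, pop[i])         # binomial crossover
            f_trial = objective(trial)
            if f_trial <= fitness[i]:                       # greedy one-to-one selection
                pop[i], fitness[i] = trial, f_trial
    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Example: minimize the sphere function in 5 dimensions
if __name__ == "__main__":
    bounds = np.array([[-5.0, 5.0]] * 5)
    x_best, f_best = de_rand_1_bin(lambda x: float(np.sum(x**2)), bounds)
    print(x_best, f_best)
```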

Cited by 25 publications (14 citation statements) | References 30 publications
“…Firstly, the focus of the experiments is to compare the performance of the proposed algorithms with the original SDE algorithm proposed by Ahandani et al (2010). Then, Algorithm 1 (the SOBDE algorithm)…”
Section: Computational Results (mentioning)
Confidence: 99%
“…Ahandani et al (2010), inspired by the partitioning and shuffling processes employed in the SFL algorithm, gave parallel search ability to the DE algorithm and called the result the SDE. The SDE, like SFL, divides the population into several subsets referred to as memeplexes, and each memeplex is improved by the DE.…”
Section: The Modified Versions Of DE (mentioning)
Confidence: 99%
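To make the partitioning and shuffling idea concrete, here is a rough sketch of how a population might be dealt into memeplexes and evolved per memeplex before re-shuffling. The round-robin assignment follows the usual SFL convention, and `de_step` is a hypothetical callback standing in for a few DE generations on a sub-population; this is an assumption-laden outline, not the exact SDE of Ahandani et al (2010).

```python
import numpy as np

def shuffle_into_memeplexes(pop, fitness, n_memeplexes):
    """Rank the population and deal individuals round-robin into memeplexes (SFL-style)."""
    order = np.argsort(fitness)                  # best individual first
    return [order[k::n_memeplexes] for k in range(n_memeplexes)]  # index arrays into pop

def shuffled_de_outline(objective, pop, fitness, n_memeplexes, de_step, n_shuffles=10):
    """Outline of a shuffled-DE loop: partition, evolve each memeplex with DE, re-shuffle."""
    for _ in range(n_shuffles):
        for idx in shuffle_into_memeplexes(pop, fitness, n_memeplexes):
            # de_step (hypothetical) evolves only the sub-population idx,
            # e.g. by running a few DE generations on it.
            pop[idx], fitness[idx] = de_step(objective, pop[idx], fitness[idx])
        # On the next pass the combined population is re-ranked and re-partitioned,
        # which shuffles information between memeplexes.
    return pop, fitness
```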
“…Optimization techniques range widely from the early gradient techniques 1 to the latest random techniques 16,18,19, including ant colony optimization 13,17. Gradient techniques are very powerful when applied to smooth well-behaved objective functions, and especially when applied to a monotonic function with a single optimum.…”
Section: Introduction (mentioning)
Confidence: 99%