Appl. Math. 2017
DOI: 10.21136/am.2017.0274-16

The classic differential evolution algorithm and its convergence properties

Abstract: Institute of Mathematics of the Czech Academy of Sciences provides access to digitized documents strictly for personal use. Each copy of any part of this document must contain these Terms of use.


Cited by 29 publications (19 citation statements)
References 6 publications
“…Evolutionary techniques such as the genetic algorithm (GA) present a significant shortcoming: their convergence speed slows down near the global optimum [18,19]. Similarly, particle swarm optimization (PSO) and differential evolution (DE) offer a high convergence rate but are prone to premature convergence, which is a critical drawback [20,21]. In the literature, penalty-based approaches such as the Lagrangian technique and the logarithmic-barrier function (LBF) method have proven their potential to reach the optimum solution with high convergence speed [22,23].…”
Section: Methods
confidence: 99%
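
The article under discussion analyzes the convergence properties of the classic DE algorithm that these citation statements refer to. As a point of reference, here is a minimal sketch of the standard DE/rand/1/bin scheme; the function name, the parameter values (pop_size, F, CR), and the sphere test function are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=200, seed=0):
    """Minimize f over box bounds with the classic DE/rand/1/bin scheme
    (a sketch; parameter defaults are conventional choices, not the paper's)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds).T          # bounds: list of (low, high) pairs
    dim = len(bounds)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([f(x) for x in pop])

    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals, none equal to i.
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lo, hi)
            # Binomial crossover, keeping at least one mutant coordinate.
            mask = rng.random(dim) < CR
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            # Greedy selection: the trial replaces the target only if no worse.
            f_trial = f(trial)
            if f_trial <= fitness[i]:
                pop[i], fitness[i] = trial, f_trial

    best = np.argmin(fitness)
    return pop[best], fitness[best]

# Example: minimize the sphere function on [-5, 5]^3.
x_best, f_best = differential_evolution(lambda x: float(np.sum(x**2)),
                                        bounds=[(-5, 5)] * 3)
print(x_best, f_best)
```

The greedy selection step is what the convergence analyses of DE typically rely on: the best fitness in the population is monotonically non-increasing, while the mutation scale F controls how quickly diversity collapses, which is the mechanism behind the premature convergence noted above.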
“…The evaluation of the derivative term, i.e., the gradient ∇(·, ·), is required to move in the gradient-descent direction, as shown in (12). The needed derivative term is obtained by substituting (20), (21), (22), and (23) into (18). Hence, the gradient ∇(·, ·) is defined by (24).…”
Section: Methods
confidence: 99%
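
The equations (12), (18), and (20)–(24) belong to the citing paper and are not reproduced in this snippet, so only the generic shape of the step it describes can be shown. A minimal sketch of the gradient-descent update the statement refers to, where the symbols x, u, J, and the step size η are assumptions:

```latex
% Generic gradient-descent update; x, u, J, and \eta are assumed symbols,
% since the quoted equations (12), (18), (20)-(24) are unavailable here.
x^{(t+1)} \;=\; x^{(t)} \;-\; \eta \, \nabla J\!\left(x^{(t)}, u^{(t)}\right),
\qquad \eta > 0 .
```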
“…In this section, the asymptotic convergence of the local optimization is analyzed first; on this basis, the global convergence analysis is presented as well [45]. Suppose that at the k-th time horizon, the optimal solution set of the cost function J_k can be defined as follows:…”
Section: Convergence Analysis of the DE Algorithm
confidence: 99%
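
The quoted statement breaks off before the definition itself. A standard formulation of an optimal solution set for a cost function J_k, offered only as a hedged reconstruction (the feasible region Ω and the symbol X_k* are assumed notation, not taken from the citing paper):

```latex
% Hedged reconstruction; \Omega and X_k^{*} are assumed notation, as the
% quoted statement truncates before the actual definition.
X_k^{*} \;=\; \bigl\{\, x^{*} \in \Omega \;:\; J_k(x^{*}) \le J_k(x)
\ \ \text{for all } x \in \Omega \,\bigr\} .
```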
“…Article [6] introduces the theoretical concepts that are then used to prove the asymptotic convergence of MDEA.…”
Section: -3
confidence: 99%