2019
DOI: 10.1155/2019/2401818

Dynamically Dimensioned Search Embedded with Piecewise Opposition-Based Learning for Global Optimization

Abstract: Dynamically dimensioned search (DDS) is a well-known optimization algorithm in the field of single-solution-based heuristic global search algorithms. Its successful application in the calibration of watershed environmental parameters has attracted researchers' extensive attention. The dynamically dimensioned search algorithm converges to the global optimum in the best case or to a good local optimum in the worst case. In other words, the performance of DDS is easily affected …

Cited by 6 publications (9 citation statements) · References 39 publications · Year published: 2019–2024

Citation statements (ordered by relevance):
“…Step 2.2 If the valve-point loading effects are considered in the ELD problem, then map these initialized grey wolf individuals to the feasible domain of the practical operation constraints according to Equation (26), and employ the loss coefficients B, B0, and B00 to calculate the transmission loss P_loss using Equation (17). Calculate the total cost function of the n committed generation units using Equation (27) as the fitness value.…”
Section: Implementation Steps of NGWO to ELD Problem (mentioning)
confidence: 99%
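The statement above relies on equations that are not reproduced here, so the following is only a sketch under common assumptions: Equation (17) is taken to be Kron's B-coefficient loss formula, Equation (27) the usual valve-point fuel-cost sum, and the mapping of Equation (26) is replaced by a simple clipping to generator limits. All function names are illustrative, not the paper's code.

```python
import numpy as np

def transmission_loss(P, B, B0, B00):
    """Kron's B-coefficient loss formula (assumed form of Equation (17)):
    P_loss = P^T B P + B0^T P + B00, with P a vector of generator outputs in MW."""
    P = np.asarray(P, dtype=float)
    return float(P @ B @ P + B0 @ P + B00)

def total_cost(P, a, b, c, e, f, P_min):
    """Assumed valve-point fuel-cost model (a standard form of Equation (27)):
    sum_i a_i P_i^2 + b_i P_i + c_i + |e_i sin(f_i (P_min_i - P_i))|."""
    P = np.asarray(P, dtype=float)
    return float(np.sum(a * P**2 + b * P + c
                        + np.abs(e * np.sin(f * (P_min - P)))))

def clip_to_limits(P, P_min, P_max):
    """Illustrative stand-in for the mapping of Equation (26): project each
    grey wolf individual back into the generators' operating limits."""
    return np.clip(np.asarray(P, dtype=float), P_min, P_max)
```

In that reading, the fitness of a grey wolf individual is total_cost(clip_to_limits(P, P_min, P_max), ...), with the power-balance constraint (total generation equals demand plus P_loss) handled by a penalty or repair step that is not shown.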
“…In other words, one approach may show very promising results on a particular class of problems, but the same algorithm may show poor results on a different set of problems [24]. Therefore, every year more researchers improve current approaches or propose new meta-heuristics for solving different complex problems; for example, the dragonfly algorithm has been hybridized with the improved Nelder-Mead algorithm (INMDA) for function optimization and multilayer perceptron training [25], and the dynamically dimensioned search has been improved by embedding piecewise opposition-based learning (DDS-POBL) for global optimization [26]. This also motivates our attempts in this paper to improve the GWO algorithm for solving complex ELD problems.…”
Section: Introduction (mentioning)
confidence: 99%
“…As the number of iterations approaches the maximum, the algorithm evolves into a local search. The key idea that allows the DDS algorithm to transition from a global search to a local search is to dynamically and probabilistically reduce the number of dimensions to be perturbed in the neighborhood of the current best solution [11,43]. This operation can be summarized as follows: in each iteration, the jth variable is randomly selected with probability P_t from the m decision variables for inclusion in the perturbation neighborhood I_perturb.…”
Section: DDS Algorithm (mentioning)
confidence: 99%
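As a rough illustration of that dimension-selection step, the sketch below uses the decay schedule P_t = 1 − ln(t)/ln(t_max) commonly reported for DDS; the schedule and the function name are assumptions, and the cited papers may differ in detail.

```python
import numpy as np

def dds_perturbation_set(m, t, t_max, rng):
    """Pick which of the m decision variables to perturb at iteration t.
    Each variable j enters I_perturb with probability P_t, which decays
    toward 0 as t approaches t_max, shifting the search from global to local."""
    P_t = 1.0 - np.log(t) / np.log(t_max)      # assumed DDS selection probability
    I_perturb = [j for j in range(m) if rng.random() < P_t]
    if not I_perturb:                          # DDS always perturbs at least one variable
        I_perturb = [int(rng.integers(m))]
    return I_perturb

# Example: in a 10-dimensional problem with 100 iterations, early iterations
# perturb most variables, while late iterations perturb only one or two.
rng = np.random.default_rng(0)
for t in (1, 50, 100):
    print(t, dds_perturbation_set(10, t, 100, rng))
```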
“…The guided local search (GLS) approach is introduced for multiuser detection in ultra-wideband systems [6]. The dynamically dimensioned search (DDS) is introduced for automatic calibration in watershed simulation models [7–11].…”
Section: Introduction (mentioning)
confidence: 99%