2019
DOI: 10.1016/j.swevo.2019.03.015
Ensemble prediction-based dynamic robust multi-objective optimization methods

Cited by 112 publications (39 citation statements)
References 32 publications
“…Three metrics are employed to analyze the performance of the schemes. The inverted generational distance (IGD) [29] calculates the distance between the objective values of the Pareto optimal solutions and the true Pareto front.…”
Section: Experimental Results and Analysis (mentioning)
confidence: 99%
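For context, IGD as described in the excerpt above averages, over a sample of the true Pareto front, the distance from each reference point to its nearest obtained solution in objective space; lower values indicate better convergence and coverage. A minimal NumPy sketch (the function name and the toy fronts are illustrative, not taken from the cited papers):

```python
import numpy as np

def inverted_generational_distance(true_front, obtained_front):
    """Mean Euclidean distance from each true Pareto-front point
    to its nearest solution in the obtained objective set."""
    true_front = np.asarray(true_front, dtype=float)          # shape (m, n_objectives)
    obtained_front = np.asarray(obtained_front, dtype=float)  # shape (k, n_objectives)
    # Pairwise distances: every true-front point vs. every obtained point.
    diffs = true_front[:, None, :] - obtained_front[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    # IGD: average of the minimum distances over the true front.
    return dists.min(axis=1).mean()

# Toy example: a sampled concave bi-objective front vs. a coarser approximation.
t = np.linspace(0, 1, 50)
true_pf = np.column_stack([t, 1 - t**2])
approx = np.column_stack([t[::5], 1 - t[::5]**2 + 0.02])
print(inverted_generational_distance(true_pf, approx))
```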
“…If X_i is a producer: update the position X_i of the bird using equation (5); else (X_i is a scrounger): update the position X_i using equation (6).…”
Section: (mentioning)
confidence: 99%
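The pseudocode fragment quoted above updates producers and scroungers differently: producers search on their own, while scroungers follow a producer. Below is a minimal sketch of those two updates, assuming the commonly cited Bird Swarm Algorithm forms (a random-walk perturbation for producers, movement toward a randomly chosen producer scaled by a following coefficient FL for scroungers); the exact equations (5) and (6) belong to the citing paper and may differ in detail:

```python
import numpy as np

rng = np.random.default_rng(0)

def producer_update(x_i):
    """Producer: random-walk search around its own position (cf. equation (5))."""
    return x_i + rng.standard_normal(x_i.shape) * x_i

def scrounger_update(x_i, x_producer, fl=1.5):
    """Scrounger: move toward a randomly chosen producer, scaled by the
    following coefficient FL, assumed here to lie in [0, 2] (cf. equation (6))."""
    return x_i + (x_producer - x_i) * fl * rng.random(x_i.shape)

# Toy usage on a 5-dimensional position vector.
x = rng.random(5)
p = rng.random(5)
print(producer_update(x))
print(scrounger_update(x, p))
```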
“…The dynamic multi-swarm method has been widely used in real-world applications because it is efficient and easy to implement. It is also commonly used to improve swarm intelligence optimization, for example in coevolutionary algorithms [26], evolutionary algorithm frameworks [27], multi-objective particle swarm optimization [28], hybrid dynamic robust optimization [29], and the PSO algorithm [30, 31]. However, the PSO algorithm easily falls into local optima and its generalization performance is limited.…”
Section: Bird Swarm Algorithm and Its Improvement (mentioning)
confidence: 99%
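For reference, the premature convergence the excerpt attributes to PSO stems from every particle being pulled toward the same global best; dynamic multi-swarm variants mitigate this by splitting the swarm into sub-swarms that are periodically regrouped. A minimal sketch of the canonical PSO update (the parameter values w, c1, c2 are illustrative defaults, not taken from the cited works):

```python
import numpy as np

rng = np.random.default_rng(1)

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One canonical PSO update: inertia plus attraction toward each particle's
    personal best and the swarm's global best (per-particle arrays)."""
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new
```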
“…However, weights cannot be used as the criterion to distinguish the first category from the second, since a number of multi-objective algorithms also use weights to determine a complete Pareto front. Examples include the multi-objective bat algorithm [33], which uses K randomly chosen weights to solve K-objective problems; the multi-objective evolutionary algorithm based on decomposition [34], [35], which decomposes a MOP into several subproblems and solves them by aggregation, using the weighted sum, Tchebycheff, or boundary intersection approach; the multi-objective evolutionary algorithm based on decomposition with adaptive replacement strategies [36], which uses a sigmoid function to adaptively adjust the replaced neighbors of individuals in MOEA/D; the multi-objective evolutionary algorithm based on decomposition with composite operator selection [35], which introduces four types of crossover-mutation operations; and the multi-objective crow search algorithm [37], which uses a set of predetermined weight vectors and employs the max-min strategy. All of these search for a representative Pareto front while applying weights. To guarantee the robustness of different multi-objective optimization algorithms and to solve dynamic multi-objective optimization problems [38], prediction models such as moving average, autoregressive, and single exponential smoothing can also be aggregated with weights. Since these multi-objective optimization algorithms do not aggregate the MOPs into single-objective optimization problems to find a single global best solution, they all belong to the second category.…”
Section: Related Work (mentioning)
confidence: 99%
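The excerpt above names two of the standard MOEA/D scalarizations (weighted sum and Tchebycheff) and the kind of simple predictors that can be combined with weights for dynamic problems. A minimal sketch of both ideas; the weight vectors, reference point, ensemble weights, and the naive trend term standing in for an autoregressive model are all illustrative assumptions, not the cited papers' formulations:

```python
import numpy as np

def weighted_sum(f, w):
    """Weighted-sum scalarization: g(x | w) = sum_i w_i * f_i(x)."""
    return np.dot(w, f)

def tchebycheff(f, w, z_star):
    """Tchebycheff scalarization: g(x | w, z*) = max_i w_i * |f_i(x) - z*_i|."""
    return np.max(w * np.abs(f - z_star))

def ensemble_forecast(history, weights=(0.4, 0.3, 0.3), alpha=0.5, window=3):
    """Weighted combination of three simple predictors of the next value:
    moving average, a naive linear-trend step (stand-in for an AR model),
    and single exponential smoothing."""
    history = np.asarray(history, dtype=float)
    ma = history[-window:].mean()
    trend = history[-1] + (history[-1] - history[-2])
    ses = history[0]
    for y in history[1:]:
        ses = alpha * y + (1 - alpha) * ses   # single exponential smoothing
    return np.dot(weights, [ma, trend, ses])

# Toy usage: scalarize one objective vector and forecast the next value of a series.
f = np.array([0.3, 0.8]); w = np.array([0.5, 0.5]); z = np.array([0.0, 0.0])
print(weighted_sum(f, w), tchebycheff(f, w, z))
print(ensemble_forecast([1.0, 1.2, 1.1, 1.4, 1.5]))
```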