2004
DOI: 10.1109/tevc.2004.826076

On the Computation of All Global Minimizers Through Particle Swarm Optimization

Cited by 637 publications (339 citation statements)
References 56 publications
“…This is a system with many roots in the considered search space Ω. Figure 3 shows the merit function of the system. In [−5, 5]², the merit function has 12 global minimizers [18]. Table 1 shows the average number of roots, N.roots_avg, the average number of function evaluations, NFE_avg, and the time in seconds, T_avg, obtained over five experimental runs of our algorithm, where we implemented:…”
Section: Repulsion Merit Functions (citation type: mentioning; confidence: 99%)
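The excerpt does not reproduce the system or its merit function, so the following is only a hedged sketch of the general construction it refers to: a system of nonlinear equations can be turned into a minimization problem via a sum-of-squares merit function, whose global minimizers with value zero are exactly the roots. The residuals below are toy placeholders, not the system studied in the cited work.

```python
import numpy as np

def merit(x, residuals):
    """Sum-of-squares merit function for a system of equations F(x) = 0.

    Its global minimizers with value zero coincide with the roots of the system.
    """
    return sum(r(x) ** 2 for r in residuals)

# Toy 2-D system with several roots in [-5, 5]^2 (placeholder, not the paper's system).
residuals = [
    lambda x: np.sin(x[0]) * np.cos(x[1]),
    lambda x: np.cos(x[0]) * np.sin(x[1]),
]

print(merit(np.array([0.0, 0.0]), residuals))  # 0.0 -> the origin is a root
```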
“…However, the same solutions may be located over and over again along the iterations, and the computational effort turns out to be quite heavy. Other approaches that combine metaheuristics with techniques that modify the objective function in problem (2) have been reported in the literature [17,18,19,20]. The technique in [20] relies on the assignment of a penalty term to each previously computed root, so that a repulsion area around the root is created.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
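The penalty idea described in this excerpt can be sketched as below. The function names, the linear penalty term, and the repulsion radius delta are assumptions chosen for illustration, not the exact formulation used in [20]: each previously computed root adds a penalty inside a small neighbourhood, creating a repulsion area that pushes the search towards roots not yet found.

```python
import numpy as np

def repulsion_merit(x, merit, found_roots, rho=1.0, delta=0.5):
    """Merit function augmented with a penalty (repulsion area) around known roots."""
    value = merit(x)
    for root in found_roots:
        d = np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(root, dtype=float))
        if d < delta:                   # only penalize inside the repulsion radius
            value += rho * (delta - d)  # penalty grows as x approaches a known root
    return value

# Usage with a toy merit function whose only root is the origin.
base = lambda x: float(np.sum(np.asarray(x) ** 2))
print(repulsion_merit([0.1, 0.0], base, found_roots=[[0.0, 0.0]]))  # penalized value
```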
“…Based upon a mathematical description of the social behavior of swarms, it has been shown that this algorithm can be efficiently applied to find good solutions to a number of complicated problems such as, for instance, static optimization problems, topological optimization, and others (Parsopoulos, K.E. et al., 2001a); (Parsopoulos, K.E. et al., 2001b); (Fourie, P.C. et al., 2000); (Fourie, P.C.…”
Section: Particle Swarm Optimization (citation type: mentioning; confidence: 99%)
“…Parameters c1 and c2 are the individual and social learning factors, and ξ and η are random numbers in the range [0.0, 1.0]. PSO has been widely used for stationary problems [13,14,18]. In recent years, PSO has attracted increasing attention for solving dynamic optimization problems (DOPs) [3].…”
Section: Particle Swarm Optimization in Dynamic Environments (citation type: mentioning; confidence: 99%)
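For reference, a minimal sketch of the standard PSO velocity and position update that this excerpt refers to, assuming the common inertia-weight variant: the inertia weight w and the default values of c1 and c2 are assumptions, since the excerpt only names c1, c2 and the uniform random factors ξ and η in [0, 1].

```python
import numpy as np

def pso_step(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One velocity/position update for a single particle.

    x, v      : current position and velocity (NumPy arrays)
    pbest     : the particle's own best position (individual term, weighted by c1)
    gbest     : the swarm's best position (social term, weighted by c2)
    """
    rng = np.random.default_rng() if rng is None else rng
    xi = rng.uniform(0.0, 1.0, size=x.shape)    # individual (cognitive) random factor
    eta = rng.uniform(0.0, 1.0, size=x.shape)   # social random factor
    v_new = w * v + c1 * xi * (pbest - x) + c2 * eta * (gbest - x)
    return x + v_new, v_new
```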