2017
DOI: 10.18100/ijamec.2017331879
Particle Swarm Optimization for Continuous Function Optimization Problems

Abstract: In this paper, particle swarm optimization (PSO) is proposed for finding the global minimum of continuous functions and is tested on benchmark problems. PSO is applied to 21 benchmark test functions, and its solutions are compared with those of formerly proposed approaches: ant colony optimization, a heuristic random optimization, the discrete filled function algorithm, an adaptive random search, the dynamic random search technique, and the random selection walk technique. The implementation of the PSO on several …
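As a rough illustration of the approach the abstract describes, a minimal PSO for minimizing a continuous function might look like the sketch below. The parameter values (inertia weight 0.7, acceleration coefficients 1.5), the swarm size, and the sphere benchmark are illustrative assumptions, not settings taken from the paper.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer: minimize f over a box [lo, hi]^dim."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Initialize particle positions uniformly in the box; velocities start at zero.
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                  # each particle's best-seen position
    pbest_val = [f(p) for p in pos]              # and its objective value
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move the particle, clamping it to the search box.
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function: a standard continuous benchmark with global minimum 0 at the origin.
sphere = lambda x: sum(v * v for v in x)
best, best_val = pso(sphere, dim=5, bounds=(-5.0, 5.0))
```

Because particles move through a continuous multidimensional space, no discretization of the search domain is needed, which is what makes PSO a natural fit for the continuous test functions studied here.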

Cited by 6 publications (2 citation statements) · References 12 publications
“…The algorithm proposed is a multi-run PSO, which addresses the nondeterministic aspect of metaheuristics and provides a higher-quality result than a single-run PSO. Compared to other metaheuristics, the PSO is a natural fit for continuous optimization functions as particles evolve in a continuous multidimensional space [20]. To address the long execution time of the metaheuristic, the algorithm is parallelized on CPUs using task-level parallelization and on GPUs using data-level parallelization, and the performance results of both methods are thoroughly compared.…”
Section: Introduction
confidence: 99%
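The multi-run idea quoted above — repeating independent stochastic runs and keeping the best result to tame the non-determinism of the metaheuristic — can be sketched as follows. The names `single_run` and `multi_run_pso` are hypothetical, the compact PSO inside each run uses assumed parameter values, and a thread pool merely stands in for the task-level CPU parallelization the citing work describes.

```python
import concurrent.futures
import random

def single_run(seed, f, dim, lo, hi, n=20, iters=100):
    """One independent, seeded PSO run; different seeds may land on different optima."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pb = [p[:] for p in pos]
    pbv = [f(p) for p in pos]
    gi = min(range(n), key=pbv.__getitem__)
    gb, gbv = pb[gi][:], pbv[gi]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # Inertia + cognitive + social velocity update, then clamp to the box.
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pb[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gb[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            v = f(pos[i])
            if v < pbv[i]:
                pb[i], pbv[i] = pos[i][:], v
                if v < gbv:
                    gb, gbv = pos[i][:], v
    return gbv, gb

def multi_run_pso(f, dim, lo, hi, runs=8):
    """Task-level parallelism: each run is an independent task; keep the best result.
    True CPU parallelism would use processes or a compiled runtime; threads suffice
    here to show the structure."""
    with concurrent.futures.ThreadPoolExecutor() as ex:
        results = list(ex.map(lambda s: single_run(s, f, dim, lo, hi), range(runs)))
    return min(results)  # tuples compare by objective value first

sphere = lambda x: sum(v * v for v in x)
best_val, best = multi_run_pso(sphere, dim=3, lo=-5.0, hi=5.0)
```

Taking the minimum over several independent runs gives a higher-quality (and more reproducible) result than any single stochastic run, at a cost that parallelization largely hides.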
“…They analyzed the method from various viewpoints: structure, parameters, discrete PSO, parallel PSO, and multi-objective PSO. Ozdemir [6] used PSO to reach the global minimum of the suggested function. He applied the method to different benchmark functions.…”
Section: Introduction
confidence: 99%