2015
DOI: 10.1016/j.apm.2014.12.034
A robust parallel algorithm of the particle swarm optimization method for large dimensional engineering problems


Cited by 26 publications (13 citation statements)
References 29 publications
“…Our implementation of PSO is based on ref. [25]. Our implementation of SS follows the outline presented in the introduction of ref.…”
Section: Implementation Details (mentioning)
confidence: 99%
“…Such algorithms often include some type of iterative randomized selection of candidate parameter sets, followed by evaluation of the selected parameter sets, which is used to direct the selection of parameters in future iterations to more favorable regions of parameter space. Many modern descriptions of metaheuristics allow for parallelized evaluation of parameter sets (Moraes et al., 2015, Penas et al., 2015, Penas et al., 2017), which is valuable when each model simulation is computationally expensive. Although these algorithms are well-established, software designed for biological applications is limited.…”
Section: Introduction (mentioning)
confidence: 99%
“…A feature of many, but not all, of these algorithms is the maintenance of a population of good parameter sets, which are used to generate new trial parameter sets. Many modern descriptions of population-based metaheuristic algorithms (e.g., [39,40,41]) allow for parallelized function evaluations within a single run of the algorithm, which enables these algorithms to take advantage of high-performance computing resources.…”
Section: Metaheuristic Optimization (mentioning)
confidence: 99%
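The scheme these citing papers describe, dispatching all candidate evaluations of one iteration concurrently, can be sketched as below. This is an illustrative synchronous parallel PSO step, not the cited paper's algorithm; the objective (`sphere`), swarm size, and coefficients are assumptions, and a thread pool stands in for the process pool or MPI ranks one would use when each simulation is expensive.

```python
from concurrent.futures import ThreadPoolExecutor
import random

def sphere(x):
    # illustrative objective: sum of squares (minimized at the origin)
    return sum(v * v for v in x)

def pso_iteration(positions, velocities, pbest, gbest, pool,
                  w=0.7, c1=1.5, c2=1.5):
    """One synchronous PSO step: every fitness evaluation of the
    iteration is dispatched to the pool and runs in parallel."""
    fitness = list(pool.map(sphere, positions))  # parallel evaluations
    for i, f in enumerate(fitness):              # update personal bests
        if f < pbest[i][0]:
            pbest[i] = (f, positions[i][:])
    gbest = min([gbest] + pbest, key=lambda t: t[0])  # global best
    for i in range(len(positions)):              # standard velocity/position update
        for d in range(len(positions[i])):
            r1, r2 = random.random(), random.random()
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][1][d] - positions[i][d])
                                + c2 * r2 * (gbest[1][d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
    return gbest

random.seed(0)
dim, n = 5, 8
positions = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
velocities = [[0.0] * dim for _ in range(n)]
pbest = [(float("inf"), p[:]) for p in positions]
gbest = (float("inf"), positions[0][:])
with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(50):
        gbest = pso_iteration(positions, velocities, pbest, gbest, pool)
print(gbest[0])
```

Because every evaluation of the iteration must finish before the swarm update, this synchronous form leaves cores idle whenever simulation times are uneven, which is the load-balancing problem the asynchronous variants address.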
“…Note that the parallelization of these algorithms is not simply from performing multiple fitting replicates (which can be trivially done for any algorithm); evaluations are parallelized within each iteration of the algorithm. Some metaheuristic algorithms (e.g., [40]) are asynchronous. Such algorithms improve load balancing by running simulations on all available cores at all times (cores are never left idle).…”
Section: Metaheuristic Optimization (mentioning)
confidence: 99%
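The asynchronous pattern described above, where a new candidate is submitted the moment any evaluation returns so cores never sit idle, can be sketched with a simple perturbation-based search. This is a minimal stand-in under assumed names (`simulate`, `perturb`, a budget of 60 evaluations), not the cited asynchronous PSO itself.

```python
from concurrent.futures import ThreadPoolExecutor, FIRST_COMPLETED, wait
import random

def simulate(x):
    # stand-in for an expensive model simulation
    return sum(v * v for v in x)

def perturb(x, scale=0.5):
    # generate a trial candidate near the current incumbent
    return [v + random.uniform(-scale, scale) for v in x]

random.seed(1)
best = [random.uniform(-5, 5) for _ in range(4)]
best_f = simulate(best)
budget, workers = 60, 4
submitted = 0

with ThreadPoolExecutor(max_workers=workers) as pool:
    pending = {}
    for _ in range(workers):                 # fill every core immediately
        cand = perturb(best)
        pending[pool.submit(simulate, cand)] = cand
        submitted += 1
    while pending:
        finished, _ = wait(pending, return_when=FIRST_COMPLETED)
        for fut in finished:
            cand = pending.pop(fut)
            f = fut.result()
            if f < best_f:                   # accept improvements as they arrive
                best_f, best = f, cand
            if submitted < budget:           # refill the freed core at once
                new = perturb(best)
                pending[pool.submit(simulate, new)] = new
                submitted += 1
print(best_f)
```

The key design point, matching the quoted description, is that the incumbent is updated and a new evaluation launched per completed result, rather than waiting for a whole generation to finish.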
“…The same is also a valid case for algorithms that are inherently feasible for parallelism, such as a genetic algorithm (GA). However, the real implementation of parallelism for such algorithms may be challenged by the fact that many current computing platforms inherit the serial processing mechanism of the von Neumann architecture [1,2]. This makes the implementation of parallel information processing complicated and expensive [3,4].…”
Section: Introduction (mentioning)
confidence: 99%