2018
DOI: 10.3390/info9010016
Improving Particle Swarm Optimization Based on Neighborhood and Historical Memory for Training Multi-Layer Perceptron

Abstract: Many optimization problems arise in scientific and engineering fields, and designing efficient algorithms to solve them remains a challenge for researchers. Particle swarm optimization (PSO), inspired by the social behavior of bird flocks, is a global stochastic method. However, a monotonic and static learning model, applied uniformly to all particles, limits the exploration ability of PSO. To overcome this shortcoming, we propose an improving particle swarm optim…
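To make the abstract concrete, the baseline it criticizes is the canonical global-best PSO, in which every particle learns from only two attractors: its own best position and the single swarm-wide best. The following is a minimal NumPy sketch of that baseline; the function name and parameter defaults are illustrative, not taken from the paper.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Canonical global-best PSO. Each particle is pulled toward its
    personal best and the global best -- the 'monotonic and static
    learning model' the abstract says limits exploration."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()                              # personal best positions
    pbest_val = np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()          # global best position
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved] = x[improved]
        pbest_val[improved] = vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Example: minimizing the 5-dimensional sphere function
best, val = pso_minimize(lambda p: float(np.sum(p * p)), dim=5)
```

The paper's proposed variant replaces this single static attractor structure with neighborhood- and memory-based learning sources; the sketch above only shows the unmodified baseline.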

Cited by 14 publications (8 citation statements). References 40 publications.
“…Moreover, GSO has been compared with BBO, GA, and MVO algorithms for classification accuracy (Alboaneen et al, 2017). Furthermore, Li (2018) has applied an improved version of PSO on the CEC 2014 test benchmarks to find the global optimum. Besides, MLP was trained to investigate the efficiency of the proposed algorithm in handling complex search spaces.…”
Section: A Hybrid CPSOGSA for Training MLP
confidence: 99%
“…In literature, there are many heuristic algorithms utilized for training MLPs such as differential evolution (DE) (Ilonen et al., 2003; Slowik and Bialko, 2008), ant colony optimization (ACO) (Blum and Socha, 2005; Socha and Blum, 2007), genetic algorithm (GA) (Whitney et al., 1990; Mirjalili et al., 2012), artificial bee colony (ABC) (Karaboga et al., 2007; Ozturk and Karaboga, 2011) and particle swarm optimization (PSO) (Mendes et al., 2002; Gudise and Venayagamoorthy, 2003). The recent additions in the list of stochastic training algorithms include social spider optimization algorithm (SSO) (Pereira, 2014), teaching-learning based optimization (TLBO) (Uzlu et al., 2014), biogeography based optimization (BBO) (Mirjalili et al., 2014), symbiotic organisms search algorithm (SOS) (Wu et al., 2016), glowworm swarm optimization (GSO) (Alboaneen et al., 2017) and improved PSO (Li, 2018).…”
Section: Introduction
confidence: 99%
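All of the metaheuristics listed in the statement above train an MLP the same way: the network's weights and biases are flattened into one real-valued vector, and the training error becomes the fitness function the optimizer minimizes. The sketch below shows that encoding on a toy XOR task; the helper names are illustrative, and a simple (1+1)-style stochastic search stands in for any of the listed algorithms (DE, ACO, GA, ABC, PSO, ...).

```python
import numpy as np

def mlp_forward(x, vec, n_in=2, n_hid=4):
    """Decode a flat parameter vector into one-hidden-layer MLP weights
    and run a forward pass (tanh hidden layer, sigmoid output)."""
    i = 0
    W1 = vec[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = vec[i:i + n_hid]; i += n_hid
    W2 = vec[i:i + n_hid]; i += n_hid
    b2 = vec[i]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def fitness(vec, X, y):
    """Mean squared training error -- the objective a metaheuristic
    minimizes when used as an MLP trainer."""
    return float(np.mean((mlp_forward(X, vec) - y) ** 2))

# XOR toy problem
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)

rng = np.random.default_rng(1)
dim = 2 * 4 + 4 + 4 + 1          # W1 + b1 + W2 + b2 = 17 parameters
best = rng.normal(0, 1, dim)
best_err = fitness(best, X, y)
for _ in range(5000):
    cand = best + rng.normal(0, 0.3, dim)   # mutate the weight vector
    err = fitness(cand, X, y)
    if err < best_err:                      # keep only improvements
        best, best_err = cand, err
preds = (mlp_forward(X, best) > 0.5).astype(int)
```

Any of the population-based algorithms above slots in by replacing the mutate-and-accept loop while keeping `fitness` unchanged; that interchangeability is why the literature can compare so many trainers on identical MLP benchmarks.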
“…PSO has been successfully applied to solve a number of problems in research and application areas including problems of optimization of functions (Logenthiran et al, 2015;Abid et al, 2017), selection and classification (Too, 2019), wireless sensor networks (Cheng et al, 2018) and learning of neural networks (Li, 2018) etc.…”
Section: PSO Theory
confidence: 99%
“…Different algorithms are used and compared to schedule loads of a smart city (Logenthiran et al, 2012;Abid et al, 2017;Li, 2018;Celik et al, 2017). The aim is to reduce the energy bill by taking into account the consumption and production data, but also the current pricing policies and any operating constraints imposed by the micro-grid manager.…”
Section: PSO Scheduling Algorithm for Demand Side
confidence: 99%
“…In the paper, the results show that GA outperforms BP on sonar image classification problems. Later, other metaheuristics were applied successfully for weight and bias optimization [3], [4], [5], [6]. The utilization of metaheuristic-based algorithms improves the search ability of neural network training for values of weights and biases, reducing the classification error rate more than gradient-based approaches.…”
confidence: 99%