2017
DOI: 10.1142/s0218126617501857

Training a Feed-Forward Neural Network Using Particle Swarm Optimizer with Autonomous Groups for Sonar Target Classification

Abstract: Feed-Forward Neural Networks (FFNNs), among the most widespread Artificial NNs, have been used to solve many practical problems, such as classification of the sonar dataset. Improper selection of the training method, which is an important part of the design process, results in a slow convergence rate, entrapment in local minima, and sensitivity to initial conditions. To overcome these issues, the recently proposed method known as "Particle Swarm Optimizer with Autonomous Groups (AGPSO)" has been used in this paper…

Cited by 38 publications (12 citation statements)
References 38 publications
“…where i is the index, x_i is one component of the input vector, w_i is the weight for each x_i, and g(·) is the transfer function, which can take many forms, such as log-sigmoid, tan-sigmoid, and purelin. The right panel of Figure 2 illustrates a three-layer BP neural network comprising one input layer, a single hidden layer, and one output layer. BP stands for backward propagation of errors [37]: data are propagated from the input layer through the hidden layer to the output layer, while the error is transmitted in the opposite direction, thereby correcting the connection weights (w_nm) of the network so that the final error becomes progressively smaller.…”
Section: ANN Model
confidence: 99%
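The equation this excerpt quotes is truncated in the extract; its definitions describe the standard single-neuron computation, a weighted sum of the inputs passed through a transfer function. The following is a minimal Python sketch of that computation under that assumption, including the three transfer functions the excerpt names; the function names and the bias-free form are illustrative choices, not taken from the cited paper.

import numpy as np

def logsig(a):
    # log-sigmoid transfer function
    return 1.0 / (1.0 + np.exp(-a))

def tansig(a):
    # tan-sigmoid (hyperbolic tangent) transfer function
    return np.tanh(a)

def purelin(a):
    # linear transfer function
    return a

def neuron_output(x, w, g=logsig):
    # y = g(sum_i w_i * x_i): weighted sum of the inputs passed through g
    return g(np.dot(w, x))

For instance, neuron_output(np.array([0.2, 0.7]), np.array([0.5, -0.3]), g=tansig) evaluates g(w·x) for a two-input neuron.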
“…In addition, many improved methods address the shortcomings of different algorithms. Mosavi and Khishe [50] proposed an autonomous groups particle swarm optimization (AGPSO) algorithm. They argue that the behavior of a group is usually determined by the individual or representative that it follows.…”
Section: Introduction
confidence: 99%
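The AGPSO idea referenced here divides the swarm into autonomous groups, each following its own time-varying acceleration-coefficient strategy on top of the usual PSO velocity update. The Python sketch below illustrates that structure only; the four coefficient curves, the inertia schedule, and all names are illustrative assumptions, not the exact formulas of the cited paper.

import numpy as np

def agpso_sketch(fitness, dim, n_particles=20, n_groups=4, iters=100):
    # AGPSO-style loop: particles are split into autonomous groups, each
    # with its own time-varying (c1, c2) schedule. The curves below are
    # assumptions for illustration, not the paper's strategies.
    rng = np.random.default_rng(0)
    pos = rng.uniform(-1, 1, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()

    # One (c1, c2) schedule per group, as functions of normalized time t in [0, 1].
    schedules = [
        lambda t: (2.5 - 2.0 * t,          0.5 + 2.0 * t),           # linear
        lambda t: (2.5 - 2.0 * t**2,       0.5 + 2.0 * t**2),        # quadratic
        lambda t: (2.5 - 2.0 * np.sqrt(t), 0.5 + 2.0 * np.sqrt(t)),  # root
        lambda t: (2.5 - 2.0 * t**3,       0.5 + 2.0 * t**3),        # cubic
    ]
    group = np.arange(n_particles) % n_groups  # assign each particle to a group

    for it in range(iters):
        t = it / max(iters - 1, 1)
        w = 0.9 - 0.5 * t  # linearly decreasing inertia weight (assumed)
        for i in range(n_particles):
            c1, c2 = schedules[group[i]](t)
            r1, r2 = rng.random(dim), rng.random(dim)
            vel[i] = (w * vel[i]
                      + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            f = fitness(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i].copy(), f
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

The point of the grouping is that exploration-heavy and exploitation-heavy coefficient curves coexist in one swarm, which is the mechanism the citing author credits for avoiding premature convergence.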
“…Consequently, this has led to the development of new and advanced meta-heuristic algorithms for training MLPs, such as the hybrid PSO-GSA [42], PSO with Autonomous Groups (PSOAG) [43], Invasive Weed Optimiser (IWO) [44], Chemical Reaction Optimiser (CRO) [45], Stochastic Fractal Search (SFS) [46], Biogeography-Based Optimiser (BBO) [47], Adaptive Best-Mass GSA (ABMGSA) [48], Chimp Optimisation Algorithm (COA) [49], Dragonfly Optimisation Algorithm (DOA) [50], Salp Swarm Optimiser (SSO) [51], Social Spider Optimisation Algorithm (SSOA), Grey Wolf Optimisation (GWO) [41], Equilibrium Optimiser (EO) [52], Sine Cosine Algorithm (SCA) [53], Modified Sine Cosine Algorithm (MSCA) [54], Whale Optimisation Algorithm (WOA) [55], Improved WOA [56], and Modified WOA [57], among others.…”
Section: Introduction
confidence: 99%
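What all of the trainers listed above share is the encoding: the MLP's weights and biases are flattened into a single real-valued vector, and a classification error on the training set serves as the fitness the metaheuristic minimises. Below is a minimal Python sketch of that encoding for a one-hidden-layer network; the layer shapes, activation choices, and use of MSE are illustrative assumptions rather than any one cited paper's setup.

import numpy as np

def mlp_fitness(theta, X, y, n_in, n_hid):
    # Decode a flat parameter vector theta into the weights and biases of a
    # one-hidden-layer MLP, then return the mean squared error on (X, y)
    # as the fitness value the optimiser minimises.
    w1 = theta[:n_in * n_hid].reshape(n_in, n_hid)
    b1 = theta[n_in * n_hid : n_in * n_hid + n_hid]
    off = n_in * n_hid + n_hid
    w2 = theta[off : off + n_hid]
    b2 = theta[off + n_hid]
    h = np.tanh(X @ w1 + b1)                      # hidden layer, tan-sigmoid
    out = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))    # sigmoid output unit
    return np.mean((out - y) ** 2)                # MSE as fitness

Any of the listed optimisers then searches over a vector theta of length n_in*n_hid + n_hid + n_hid + 1; the agpso_sketch above could be called with this function as its fitness argument.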
“…The comparison was performed by applying the GWO-MLP classifier to both the original dataset (without clustering) and the balanced clustered dataset. The performance of GWO-MLP on these distinct datasets was assessed using the eight classifier performance indicators (Eqs. (40)–(47))…”
confidence: 99%
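The eight indicators behind Eqs. (40)–(47) belong to the citing paper and are not recoverable from this excerpt. As an illustration of the kind of confusion-matrix-based indicators typically meant, here is a hedged Python sketch computing a few common ones; which exact eight the paper uses is an open question here.

def classifier_metrics(tp, fp, tn, fn):
    # Common confusion-matrix indicators computed from true/false
    # positive and negative counts. Which eight Eqs. (40)-(47) define
    # is not recoverable from the excerpt, so this selection is assumed.
    accuracy    = (tp + tn) / (tp + fp + tn + fn)
    precision   = tp / (tp + fp) if tp + fp else 0.0
    recall      = tp / (tp + fn) if tp + fn else 0.0   # sensitivity
    specificity = tn / (tn + fp) if tn + fp else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "specificity": specificity, "f1": f1}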