2010
DOI: 10.1007/s00521-010-0504-3
Metaheuristics for the feedforward artificial neural network (ANN) architecture optimization problem

Abstract: This article deals with evolutionary artificial neural networks (ANNs) and aims to propose a systematic, automated way to find a proper network architecture. To this end, we adapt four metaheuristics to the problem of searching for the optimum feedforward ANN architecture and introduce a new criterion for measuring ANN performance based on a combination of training and generalization error. We also propose a new method for estimating the computational complexity of the ANN architecture based on…

Cited by 61 publications (42 citation statements); references 19 publications.
“…For example, local search algorithms such as SA, TS, GRASP, and VNS [98], estimation-of-distribution algorithms [187], and global search algorithms such as GA, ACO, and memetic algorithms were examined thoroughly in [186]. Additionally, many researchers studied the performance of metaheuristic algorithms for training FNNs and reported that the metaheuristic approaches outperform conventional methods by a large margin [188][189][190].…”
Section: Weight Optimization
confidence: 99%
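The statement above describes training a feedforward network's weights with a metaheuristic instead of gradient descent. A minimal, illustrative sketch of that idea (not any cited paper's method): a (1+1) hill-climbing search over the flat weight vector of a tiny 2-2-1 network on the XOR task. The architecture, step size, and iteration budget are all assumptions chosen for the toy problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data: a classic task that a 2-2-1 feedforward network can solve.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, X):
    # Unpack a flat 9-element weight vector into a 2-2-1 network with biases.
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = np.tanh(X @ W1 + b1)                       # hidden layer
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))    # sigmoid output

def mse(w):
    return float(np.mean((forward(w, X) - y) ** 2))

def hill_climb(iters=20000, sigma=0.3):
    # (1+1) scheme: perturb all weights, keep the candidate only if it improves.
    w = rng.normal(0, 1, 9)
    best = mse(w)
    for _ in range(iters):
        cand = w + rng.normal(0, sigma, 9)
        err = mse(cand)
        if err < best:
            w, best = cand, err
    return w, best

w, err = hill_climb()
print(f"final MSE: {err:.4f}")
```

Gradient-free searches like this trade efficiency for generality: they need only a fitness value per candidate, which is why the surveyed works can swap in GA, ACO, or memetic variants without touching the network code.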
“…More recently, [129] has shown the benefits of optimizing each neuron's transfer function, creating heterogeneous networks. In a similar approach, [130] presents a methodology for finding the best neural network architecture using metaheuristics. The authors tested generalized extremal optimization (GEO), VNS, SA, and the canonical GA.…”
Section: Using Metaheuristics to Improve Machine Learning
confidence: 99%
“…The optimal neural network topology can be expressed as the minimization of a cost function. Metaheuristics used to compute the best topology include hybrid Particle Swarm Optimization (PSO) and Genetic Algorithms (GA) [22], constructive methods and pruning [5], or a comparison among several algorithms: Genetic Algorithm (GA), Simulated Annealing (SA), Generalized Extremal Optimization (GEO), and Variable Neighbourhood Search (VNS) [3].…”
Section: Self-Configured MLP-NN
confidence: 99%
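The abstract and the statement above both frame topology selection as minimizing a cost that blends training error, generalization error, and architectural complexity. A minimal sketch of such a criterion; the blending weights `lam_err` and `lam_complexity` are hypothetical, and the article's exact formula differs.

```python
def topology_cost(train_err, val_err, n_params,
                  lam_err=0.5, lam_complexity=1e-4):
    # Blend training and generalization (validation) error, then add a
    # penalty proportional to the number of parameters. Hypothetical
    # weights; the article's criterion is defined differently.
    blended = lam_err * train_err + (1 - lam_err) * val_err
    return blended + lam_complexity * n_params

# Example: a network with low training error, higher validation error,
# and 120 weights.
score = topology_cost(0.02, 0.05, 120)
print(f"topology cost: {score:.3f}")
```

Any of the metaheuristics surveyed here can then minimize this single scalar, which is what makes the error/complexity trade-off explicit rather than left to early stopping.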