2006
DOI: 10.1109/tnn.2005.860885
Tuning the Structure and Parameters of a Neural Network by Using Hybrid Taguchi-Genetic Algorithm

Abstract: In this paper, a hybrid Taguchi-genetic algorithm (HTGA) is applied to solve the problem of tuning both network structure and parameters of a feedforward neural network. The HTGA approach is a method of combining the traditional genetic algorithm (TGA), which has a powerful global exploration capability, with the Taguchi method, which can exploit the optimum offspring. The Taguchi method is inserted between crossover and mutation operations of a TGA. Then, the systematic reasoning ability of the Taguchi method…
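The abstract describes the HTGA layout: a standard GA loop in which a Taguchi step, driven by a two-level orthogonal array, is inserted between crossover and mutation. A minimal sketch of that layout, assuming a real-coded GA minimizing the sphere function; all function names, parameter values, and the best-of-trials simplification of the signal-to-noise analysis are illustrative, not taken from the paper:

```python
import random

random.seed(0)

DIM = 7                      # number of genes (factors); fits an L8 array
L8 = [                       # two-level orthogonal array L8(2^7)
    [0, 0, 0, 0, 0, 0, 0],
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 1, 1, 1, 1, 0, 0],
    [1, 0, 1, 0, 1, 0, 1],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

def fitness(x):              # sphere function: lower is better
    return sum(v * v for v in x)

def taguchi_offspring(p1, p2):
    """Taguchi step: mix genes from two parents according to the
    orthogonal-array rows and keep the best trial (a simplification
    of the full signal-to-noise analysis)."""
    trials = [[(p1 if lv == 0 else p2)[i] for i, lv in enumerate(row)]
              for row in L8]
    return min(trials, key=fitness)

def evolve(pop_size=20, gens=50):
    pop = [[random.uniform(-5, 5) for _ in range(DIM)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        nxt = pop[:pop_size // 2]              # elitist selection
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)
            child = taguchi_offspring(p1, p2)  # Taguchi step after crossover
            # mutation follows the Taguchi step, as in the HTGA layout
            child = [g + random.gauss(0, 0.1) if random.random() < 0.1 else g
                     for g in child]
            nxt.append(child)
        pop = nxt
    return min(pop, key=fitness)

best = evolve()
print(fitness(best) < fitness([5.0] * DIM))  # True: improved over a corner point
```

The orthogonal array lets each pair of parents be recombined in only 8 structured trials instead of all 2^7 gene combinations, which is the exploitation step the abstract attributes to the Taguchi method.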

Cited by 268 publications (130 citation statements)
References 27 publications
“…Such examples are as follows: in [209], a PSO-PSO method was proposed, in which a PSO (inner PSO block) optimizing the weights was nested under another PSO (outer PSO block) optimizing the architecture of the FNN by adding or deleting hidden nodes. Similarly, in [210,211], a hybrid Taguchi-genetic algorithm was proposed for optimizing the FNN architecture and weights, where the authors used a genetic representation of the weights but selected the structure with a constructive method (adding hidden nodes one by one). A multidimensional PSO approach was proposed in [212] for constructing FNNs automatically by using an architectural (topological) space.…”
Section: Architecture Plus Weight Optimization (mentioning)
confidence: 99%
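The nested scheme quoted above — an outer search over the architecture wrapping an inner search over the weights — can be sketched compactly. Random search stands in for both PSO blocks here, and the toy network, data, and every name are illustrative assumptions, not the cited method:

```python
import math
import random

random.seed(1)
X = [(0.0,), (0.5,), (1.0,)]   # toy 1-D inputs
Y = [0.0, 1.0, 0.0]            # toy targets (a bump)

def forward(x, w, hidden):
    # one-hidden-layer net with tanh units; w packs all parameters
    h = [math.tanh(w[2 * j] * x[0] + w[2 * j + 1]) for j in range(hidden)]
    return sum(w[2 * hidden + j] * h[j] for j in range(hidden))

def mse(w, hidden):
    return sum((forward(x, w, hidden) - y) ** 2 for x, y in zip(X, Y)) / len(X)

def inner_search(hidden, iters=300):
    # inner block: optimize the weights for a fixed architecture
    dim = 3 * hidden
    best = [random.uniform(-2, 2) for _ in range(dim)]
    for _ in range(iters):
        cand = [random.uniform(-2, 2) for _ in range(dim)]
        if mse(cand, hidden) < mse(best, hidden):
            best = cand
    return mse(best, hidden)

def outer_search(max_hidden=4):
    # outer block: pick the hidden-node count whose best weights fit best
    return min(range(1, max_hidden + 1), key=inner_search)

best_hidden = outer_search()
```

The key design point of such nested schemes is that every architecture candidate evaluated by the outer block triggers a full inner weight optimization, which is why the quoted alternatives (constructive addition of hidden nodes, or a joint architectural space) were proposed to reduce that cost.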
“…However, it is very difficult to find a global optimum using only TM, because an OA covers only a limited number of design variables at a limited number of levels [4]. In order to resolve the drawbacks of TM, hybrid methods have been introduced [5][6][7]. In one study [5], a hybrid method which combined TM with an evolution strategy (ES) was proposed.…”
Section: Introduction (mentioning)
confidence: 99%
“…Subsequently, based on the result acquired from the ES, a robust antenna configuration is obtained by TM. TM with a genetic algorithm (GA) or particle swarm optimization (PSO) was also investigated in schemes where robust genes or particles were selected and utilized [6,7].…”
Section: Introduction (mentioning)
confidence: 99%
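The limitation these quotes attribute to the Taguchi method (TM) comes from the orthogonal array itself: an OA fixes both the number of trials and the factor levels in advance, trading coverage for balance. A small check of that balance property on the smallest two-level array, L4(2^3); the array values are standard, the script is only illustrative:

```python
from itertools import combinations

# L4(2^3): 4 trials, 3 two-level factors
L4 = [
    [0, 0, 0],
    [0, 1, 1],
    [1, 0, 1],
    [1, 1, 0],
]

# Balance property: every pair of columns covers each of the four
# level combinations exactly once.
for c1, c2 in combinations(range(3), 2):
    pairs = sorted((row[c1], row[c2]) for row in L4)
    print(pairs)  # [(0, 0), (0, 1), (1, 0), (1, 1)] for every column pair
```

Four trials can never distinguish more than four parameter settings, which is why the quoted hybrids hand the global search to an ES, GA, or PSO and keep TM only for local exploitation.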
“…In the past decade, Evolutionary Algorithms (EAs) (Back et al 1997; Eiben and Smith 2003) and Artificial Neural Networks (ANNs) (Bishop 2006) have been combined as a key research area, providing an interesting platform for simultaneously optimizing both the weights and the architecture of Multilayer Perceptrons (MLPs) (Tsai et al 2006; Leung et al 2003; Yao 1999; Maniezzo 1993) while avoiding the shortcomings of traditional Backpropagation (Angeline et al 1994). Evolutionary Programming (EP) was originally proposed by Fogel et al (1966) and it is one of the most renowned branches of EAs whose main variation operator is mutation.…”
Section: Introduction (mentioning)
confidence: 99%