2006
DOI: 10.1007/11740698_5
Simultaneous Optimization of Weights and Structure of an RBF Neural Network

Cited by 4 publications (3 citation statements). References 9 publications.
“…model complexity, representation ability, and model smoothness, and an RBF Network ensemble has been constructed from this Pareto set. A new evolutionary algorithm, the RBF-Gene algorithm, has been applied to optimize RBF Networks [8]. Unlike other works, their algorithm can evolve both the structure and the numerical parameters of the network: it is able to evolve the number of neurons and their weights.…”
Section: Related Work of Evolutionary Multi-objective RBF Network (citation type: mentioning)
confidence: 99%
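The statement above describes an algorithm that evolves both the number of hidden neurons and their weights. A minimal illustrative sketch of such a combined structural/parametric evolution, not the RBF-Gene algorithm itself, might treat the genome as a variable-length list of Gaussian neurons and mutate either the structure or the parameters (function names and mutation rates here are illustrative assumptions):

```python
import math
import random

random.seed(0)

# Toy regression target for the RBF network to approximate
xs = [i / 20 for i in range(-20, 21)]
ys = [math.sin(3 * x) for x in xs]

def predict(genome, x):
    # genome: list of (center, width, weight) triples, one per hidden neuron
    return sum(w * math.exp(-((x - c) / s) ** 2) for c, s, w in genome)

def mse(genome):
    return sum((predict(genome, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

def mutate(genome):
    g = [list(t) for t in genome]
    op = random.random()
    if op < 0.15:                   # structural mutation: add a neuron
        g.append([random.uniform(-1, 1), random.uniform(0.1, 1.0),
                  random.gauss(0, 1)])
    elif op < 0.3 and len(g) > 1:   # structural mutation: remove a neuron
        g.pop(random.randrange(len(g)))
    else:                           # parametric mutation: perturb one gene
        t = random.choice(g)
        t[random.randrange(3)] += random.gauss(0, 0.2)
        t[1] = max(abs(t[1]), 1e-3)  # keep widths positive
    return [tuple(t) for t in g]

# Simple (1+1)-style loop: accept a mutant if it does not worsen the fit,
# so both neuron count and weights are optimized simultaneously
best = [(random.uniform(-1, 1), 0.5, random.gauss(0, 1)) for _ in range(3)]
init_err = mse(best)
for _ in range(3000):
    child = mutate(best)
    if mse(child) <= mse(best):
        best = child
```

The key point the citation makes is visible in `mutate`: a single operator set changes either the network's size or its numerical parameters, so no separate structure-selection phase is needed.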
“…This trade-off is a well-known problem, formulated as the Multiobjective Optimization Problem (MOP) [49,50,51,52]. This paper applies NSGA II (Non-dominated Sorting Genetic Algorithm), proposed by Deb et al. (2002), to solve this problem, as it has recently been applied to a wide range of scenarios [53,54,55,56]. On the other hand, for (near) optimal estimation and adjustment of the two other RBF parameters (unit widths and output weights), we implement Particle Swarm Optimization (PSO), which balances global and local search among its interacting particles and has proved effective at finding optima in a search space [57,58,59].…”
Section: Introduction (citation type: mentioning)
confidence: 99%
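The citing work applies PSO to tune the remaining RBF parameters (unit widths and output weights) once the structure is fixed. A minimal global-best PSO sketch of that idea, under the assumption of fixed centers and a toy regression target (not the cited implementation), could look like:

```python
import math
import random

random.seed(1)

# Fixed RBF centers; PSO searches only over widths and output weights,
# mirroring a setup where the structure is decided elsewhere
centers = [-0.5, 0.0, 0.5]
xs = [i / 10 for i in range(-10, 11)]
ys = [math.sin(2 * x) for x in xs]

def rbf_out(params, x):
    # params = [s1..sk, w1..wk]: per-neuron widths, then output weights
    k = len(centers)
    return sum(params[k + j] *
               math.exp(-((x - centers[j]) / max(params[j], 1e-3)) ** 2)
               for j in range(k))

def cost(params):
    return sum((rbf_out(params, x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Standard global-best PSO update: inertia + cognitive + social terms
dim, n_particles = 2 * len(centers), 20
pos = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
vel = [[0.0] * dim for _ in range(n_particles)]
pbest = [p[:] for p in pos]
gbest = min(pos, key=cost)[:]
init_cost = cost(gbest)

for _ in range(200):
    for i in range(n_particles):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if cost(pos[i]) < cost(pbest[i]):
            pbest[i] = pos[i][:]
        if cost(pos[i]) < cost(gbest):
            gbest = pos[i][:]
```

The "global and local search" mentioned in the quote corresponds to the social term (pull toward `gbest`) and the cognitive term (pull toward each particle's own `pbest`), respectively.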
“…This trade-off is a well-known problem, formulated as the Multiobjective Optimization Problem (MOP) [150,169,170,171]. This dissertation applies NSGA II (Non-dominated Sorting Genetic Algorithm), proposed by Deb et al. (2002), to solve this problem, as it has recently been applied to a wide range of scenarios [172,173,174,175]. On the other hand, for (near) optimal estimation and adjustment of the two other RBF parameters (unit widths and output weights), we implement Particle Swarm Optimization (PSO), which balances global and local search among its interacting particles and has proved effective at finding optima in a search space [117,118,176].…”
Section: Methods for Mitigating DoS Attacks (citation type: mentioning)
confidence: 99%