2009
DOI: 10.1016/j.neucom.2008.09.020
Combined projection and kernel basis functions for classification in evolutionary neural networks

Cited by 45 publications (14 citation statements)
References 40 publications
“…For the Heart dataset, the best training and test accuracies obtained were 95.8% and 86.8%, respectively. The Heart result is comparable to the best result obtained in [8] using sigmoidal-unit networks, where the mean accuracy was 86.9%. Interestingly, the best a-NDM network to emerge makes use only of the max weight function and the identity node function, ignores some of the input nodes, and requires only 2 iterations to compute (see Fig.…”
Section: Results (supporting)
confidence: 77%
“…In other words, solution vectors contain information on both the architecture and weights of the hybrid neural networks. Examples of evolutionary techniques applied to the simultaneous optimization of neural architectures and weights include [22,23,8]. For a recent survey on the application of evolutionary techniques to the optimization of neural networks see [24].…”
Section: Optimization (mentioning)
confidence: 99%
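The statement above describes a direct encoding in which a single solution vector carries both the architecture and the weights of the network, so that one evolutionary loop optimizes both simultaneously. A minimal sketch of such an encoding is given below; the genome layout, unit-type labels, and mutation probabilities are illustrative assumptions, not details taken from the cited papers.

```python
import random

# Hypothetical direct encoding: one genome describes both the architecture
# (number and type of hidden units) and the weights of a hybrid network.
def random_unit(n_inputs):
    """A hidden unit: its basis-function type plus its input weights."""
    return {"type": random.choice(["rbf", "pu"]),
            "weights": [random.uniform(-1, 1) for _ in range(n_inputs)]}

def random_genome(n_inputs, max_units=5):
    """A genome is a variable-length list of hidden units."""
    return [random_unit(n_inputs) for _ in range(random.randint(1, max_units))]

def mutate(genome, n_inputs, p_structural=0.3):
    """Structural mutation alters the architecture (add/remove a unit);
    otherwise a parametric mutation perturbs one weight."""
    genome = [dict(u, weights=list(u["weights"])) for u in genome]  # deep-ish copy
    if random.random() < p_structural:
        if len(genome) > 1 and random.random() < 0.5:
            genome.pop(random.randrange(len(genome)))   # remove a hidden unit
        else:
            genome.append(random_unit(n_inputs))        # add a hidden unit
    else:
        u = random.choice(genome)
        i = random.randrange(n_inputs)
        u["weights"][i] += random.gauss(0.0, 0.1)       # perturb one weight
    return genome
```

Because architecture and weights live in the same genome, a single mutation operator can move through both the structural and the parametric search space, which is the property the quoted passage highlights.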
“…In this context, a recent work has been presented that proposes an evolutionary algorithm evolving a one-hidden-layer neural network containing both kernel functions (RBFs) and projection functions (PUs) [29]. This hybrid structure has emerged as an excellent alternative, the conclusion being that the combination of RBF and PU offers very competitive performance.…”
Section: Neural Network Structure and Functional Form (mentioning)
confidence: 99%
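The hybrid structure described above combines two kinds of basis function in one hidden layer: Gaussian RBF (kernel) units and product (projection) units, summed linearly at the output. A minimal sketch of that functional form follows; the parameter shapes and function names are assumptions for illustration, and the product unit assumes positive inputs so that real-valued exponents are well defined.

```python
import numpy as np

def rbf_unit(x, center, radius):
    """Gaussian kernel basis function: exp(-||x - c||^2 / r^2)."""
    return np.exp(-np.sum((x - center) ** 2) / radius ** 2)

def product_unit(x, exponents):
    """Projection (product) unit: prod_i x_i ** w_i (inputs assumed positive)."""
    return np.prod(x ** exponents)

def hybrid_net(x, rbf_params, pu_params, beta):
    """One hidden layer mixing RBF and PU basis functions, linear output.

    rbf_params: list of (center, radius) pairs, one per RBF unit.
    pu_params:  list of exponent vectors, one per product unit.
    beta:       output weights, beta[0] being the bias.
    """
    hidden = [rbf_unit(x, c, r) for c, r in rbf_params]
    hidden += [product_unit(x, w) for w in pu_params]
    return beta[0] + np.dot(beta[1:], hidden)
```

RBF units respond locally around their centers while product units capture global multiplicative interactions between inputs, which is why mixing the two can outperform either pure model, as the quoted passages report.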
“…It can be anticipated that the accuracy of the hybrid RBF and PU model is better than that of pure models, as previous works have found [29]. Although a model based on a pure RBF layer has been implemented, its results are not included in this paper because they are worse than those of the response surface model.…”
Section: Evolutionary Algorithm (EA) (mentioning)
confidence: 99%