Neural Smithing (1999)
DOI: 10.7551/mitpress/4937.001.0001
Abstract: Artificial neural networks are nonlinear mapping systems whose structure is loosely based on principles observed in the nervous systems of humans and animals. The basic idea is that massive systems of simple units linked together in appropriate ways can generate many complex and interesting behaviors. This book focuses on the subset of feedforward artificial neural networks called multilayer perceptrons (MLP). These are the most widely used neural networks, with applications as diverse as finance (forecasting…
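As a minimal sketch of the kind of network the book studies, the forward pass of an MLP with tanh hidden layers and a linear output might look as follows (an illustrative example, not code from the book; all names and sizes are assumptions):

```python
import numpy as np

def mlp_forward(x, weights, biases):
    """Forward pass of a multilayer perceptron.

    Hidden layers use tanh; the output layer is linear, a common
    choice for regression-style nonlinear mappings.
    """
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = np.tanh(a @ W + b)               # nonlinear hidden layer
    return a @ weights[-1] + biases[-1]      # linear output layer

# Example: 2 inputs -> 4 hidden units -> 1 output
rng = np.random.default_rng(0)
weights = [rng.normal(size=(2, 4)), rng.normal(size=(4, 1))]
biases = [np.zeros(4), np.zeros(1)]
y = mlp_forward(np.array([[0.5, -1.0]]), weights, biases)
```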

Cited by 486 publications (135 citation statements). References 0 publications.
“…Methods such as Bayesian regularization, early stopping, etc. are commonly used to improve the generalization in neural networks [37]. In this study, the Levenberg-Marquardt method is used together with Bayesian regularization in training neural networks in order to obtain neural networks with good generalization capability.…”
Section: Predictive Neural Network Modeling Algorithm (mentioning)
confidence: 99%
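For readers unfamiliar with the generalization techniques this statement names, early stopping can be sketched generically as below (a hypothetical interface, assumed for illustration; `train_step` and `val_error` are caller-supplied callables). Bayesian regularization instead augments the training objective with a weighted penalty on the squared weights, trading data fit against weight magnitude.

```python
def train_with_early_stopping(train_step, val_error, max_epochs=1000, patience=20):
    """Generic early stopping: halt training when validation error
    stops improving for `patience` consecutive epochs.

    train_step() runs one training epoch; val_error() returns the current
    error on a held-out validation set. Both are assumed to be supplied
    by the caller (illustrative interface only).
    """
    best_err, best_epoch = float("inf"), 0
    for epoch in range(max_epochs):
        train_step()
        err = val_error()
        if err < best_err:
            best_err, best_epoch = err, epoch  # snapshot the best model here
        elif epoch - best_epoch >= patience:
            break                              # validation error has stalled
    return best_epoch, best_err
```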
“…In this study, the Levenberg-Marquardt method is used together with Bayesian regularization in training neural networks in order to obtain neural networks with good generalization capability. The details of the Levenberg-Marquardt algorithm can be found in [37].…”
Section: Predictive Neural Network Modeling Algorithm (mentioning)
confidence: 99%
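At its core, Levenberg-Marquardt takes a damped Gauss-Newton step on a sum-of-squared-errors objective; a rough sketch follows (the interface is illustrative, not the implementation described in [37]):

```python
import numpy as np

def levenberg_marquardt_step(w, residuals, jacobian, mu):
    """One Levenberg-Marquardt update for a least-squares objective.

    residuals(w) returns the error vector e; jacobian(w) returns de/dw.
    The damping term mu blends between Gauss-Newton (mu -> 0) and
    small-step gradient descent (large mu). Callables are assumed
    supplied by the caller.
    """
    e = residuals(w)
    J = jacobian(w)
    H = J.T @ J + mu * np.eye(w.size)  # damped approximation of the Hessian
    g = J.T @ e                        # gradient of 0.5 * ||e||^2
    return w - np.linalg.solve(H, g)   # solve the damped normal equations
```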
“…The representation ability of a neural network depends on the number of layers, on the number of neurons per layer and on the connectivity between layers. It was demonstrated that a network with two hidden layers can solve any pattern classification problem [65,6,94]. On the other hand, several authors have proved that any function approximation problem can be solved by using one hidden layer [60,38,37,23].…”
Section: Single-Population Methods for MLP Optimization (G-prop) (mentioning)
confidence: 99%
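A toy demonstration of the one-hidden-layer result: a small tanh network trained by plain batch gradient descent to approximate sin(x). This is an illustrative sketch, not the construction from the cited proofs; the layer size, learning rate, and iteration count are arbitrary assumptions.

```python
import numpy as np

# One hidden layer of tanh units approximating sin(x) on [-pi, pi].
rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200)[:, None]
y = np.sin(X)

H = 10                                  # hidden units
W1, b1 = rng.normal(size=(1, H)), np.zeros(H)
W2, b2 = rng.normal(size=(H, 1)), np.zeros(1)
lr = 0.05

for _ in range(5000):                   # plain batch gradient descent
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ W2 + b2                  # linear output
    err = pred - y
    # backpropagate the squared-error gradient through both layers
    gW2 = h.T @ err / len(X)
    gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h**2)      # tanh'(z) = 1 - tanh(z)^2
    gW1 = X.T @ dh / len(X)
    gb1 = dh.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(float(np.mean(err**2)))           # report the final training MSE
```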
“…Now, however, the dilemma of overfitting arises: small networks generalize well, but are slow at learning, whereas big networks learn fast (needing fewer training epochs to obtain similar precision), but generalize badly [5,4]. The way to obtain good generalization ability is to use the smallest network that can learn the input data efficiently [93,94,103]. For this reason, this operator is combined with the following one.…”
Section: Single-Population Methods for MLP Optimization (G-prop) (mentioning)
confidence: 99%
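One naive way to act on the "smallest network that learns the data" heuristic is an incremental search over hidden-layer sizes, as sketched below (a hypothetical helper, assumed for illustration; `train_and_score` is a caller-supplied routine that trains an MLP of the given size and returns its validation error):

```python
def smallest_adequate_network(train_and_score, max_hidden=50, tolerance=0.01):
    """Return the smallest hidden-layer size whose validation error
    falls at or below `tolerance`, or None if no size within the
    budget qualifies.
    """
    for n_hidden in range(1, max_hidden + 1):
        if train_and_score(n_hidden) <= tolerance:
            return n_hidden   # smallest network that learns the data
    return None               # no size within budget met the tolerance
```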