1991
DOI: 10.1016/0893-6080(91)90032-z
Back-propagation algorithm which varies the number of hidden units

Cited by 435 publications (132 citation statements). References 1 publication.
“…For example, a neuron can be automatically created in or removed from the hidden layer according to some condition: a threshold value can be compared to the training error rate [3,20], or a set of rules can be defined over the error values [6]. The result of the comparison, or the satisfaction of the rules, then determines whether the number of neurons is adjusted.…”
Section: Q (mentioning); confidence: 99%
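The grow-as-needed scheme this statement describes can be sketched as below. This is a minimal illustration on the XOR task, not the exact algorithm of the 1991 paper; the error threshold, learning rate, epoch count, and cap on hidden units are all arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy task: XOR, with a constant bias input appended to each pattern.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W1, W2):
    h = sigmoid(X @ W1)                   # hidden activations
    hb = np.hstack([h, np.ones((4, 1))])  # append bias unit for output layer
    return h, hb, sigmoid(hb @ W2)

def train(W1, W2, epochs=3000, lr=2.0):
    """Plain batch back-propagation on squared error."""
    for _ in range(epochs):
        h, hb, out = forward(W1, W2)
        d_out = (out - y) * out * (1 - out)        # output-layer delta
        d_h = (d_out @ W2[:-1].T) * h * (1 - h)    # hidden delta (skip bias row)
        W2 -= lr * hb.T @ d_out
        W1 -= lr * X.T @ d_h
    _, _, out = forward(W1, W2)
    return W1, W2, float(np.sqrt(np.mean((out - y) ** 2)))

# Start with one hidden unit; whenever training stalls above the error
# threshold, add a freshly initialised unit and retrain.
n_hidden, threshold = 1, 0.10
W1 = rng.normal(0, 0.5, (3, n_hidden))
W2 = rng.normal(0, 0.5, (n_hidden + 1, 1))
W1, W2, rmse = train(W1, W2)
while rmse > threshold and n_hidden < 8:
    n_hidden += 1
    W1 = np.hstack([W1, rng.normal(0, 0.5, (3, 1))])
    W2 = np.vstack([W2[:-1], rng.normal(0, 0.5, (1, 1)), W2[-1:]])
    W1, W2, rmse = train(W1, W2)

print(n_hidden, round(rmse, 3))
```

A pruning step (dropping a unit whose outgoing weight stays near zero) would follow the same pattern, deleting a column of `W1` and the matching row of `W2`.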
“…11,16,17 Based on the limits established by ANP Ordinance No. 309,2 the application of multilayer-perceptron artificial neural networks was tested for classifying gasolines as type A, type C, or adulterated type C and, further, based on the network's responses, for establishing the characteristics of the gasolines sold in the Londrina region.…”
unclassified
“…Back-propagation (BP) [5] is the most popular ANN training algorithm. It is a gradient descent method that aims at minimizing the total root mean squared error (RMSE) between actual and desired outputs of a network by employing gradient information.…”
Section: Introduction (mentioning); confidence: 99%
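As a small check of the "gradient information" that BP descends along, the sketch below compares the chain-rule (back-propagation) gradient of a single sigmoid neuron's squared error against a central finite-difference estimate. The input, target, and weights are arbitrary illustrative values.

```python
import numpy as np

rng = np.random.default_rng(1)

# One sigmoid neuron with 3 inputs, squared-error loss.
x = rng.normal(size=3)   # input pattern
t = 0.7                  # desired output
w = rng.normal(size=3)   # weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(w):
    return 0.5 * (sigmoid(w @ x) - t) ** 2

# Analytic gradient via the chain rule -- exactly what BP computes.
o = sigmoid(w @ x)
grad = (o - t) * o * (1 - o) * x

# Central finite-difference estimate of the same gradient.
eps = 1e-6
fd = np.array([
    (loss(w + eps * np.eye(3)[i]) - loss(w - eps * np.eye(3)[i])) / (2 * eps)
    for i in range(3)
])

# The two should agree to within finite-difference error (near zero).
print(np.max(np.abs(grad - fd)))
```

Gradient descent then updates `w -= lr * grad`, which is the weight-change rule the RMSE minimisation described above relies on.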