2006
DOI: 10.1007/11759966_126

Estimating the Number of Hidden Neurons in a Feedforward Network Using the Singular Value Decomposition

Cited by 22 publications (15 citation statements)
References 12 publications
“…Different training errors can be obtained by BP neural network training. We treat different numbers of hidden nodes as the dynamic parameter of the BP neural networks [40][41][42] and select the network with the smallest simulation error as the forecasting model for petroleum projects. Then, we will confirm Fig.…”
Section: The Design of BP Neural Network
Mentioning confidence: 99%
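The selection procedure this excerpt describes can be sketched as a simple search over candidate hidden-node counts, keeping the network with the smallest validation (simulation) error. The MLPRegressor setup, candidate range, and synthetic data below are illustrative assumptions, not the cited papers' configuration.

```python
# Minimal sketch (not the cited papers' exact procedure): treat the hidden-node
# count as the tunable parameter of a backpropagation network and keep the
# model with the smallest simulation (validation) error.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 6))                              # placeholder features
y = X @ rng.normal(size=6) + 0.1 * rng.normal(size=400)    # placeholder target

X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

best_h, best_err = None, np.inf
for h in range(2, 21):                                     # candidate hidden-node counts
    net = MLPRegressor(hidden_layer_sizes=(h,), max_iter=2000, random_state=0)
    net.fit(X_tr, y_tr)
    err = mean_squared_error(y_val, net.predict(X_val))
    if err < best_err:                                     # keep the smallest error
        best_h, best_err = h, err

print(f"selected hidden nodes: {best_h}, validation MSE: {best_err:.4f}")
```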
“…The number of hidden layer units has been shown to be an important parameter for data learning and accurate identification. Teoh et al. (2006) attempt to quantify the effect of increasing the number of neurons in the hidden layer of a feedforward neural network using singular value decomposition (SVD). Shuxiang and Ling (2008) reviewed several mechanisms in the neural network literature that have been used to determine the optimal number of hidden layer neurons for a given application, proposed a new approach based on mathematical evidence, and applied it to financial data mining.…”
Section: Optimization of the Number of Neurons in the Hidden Layer
Mentioning confidence: 99%
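A rough illustration of the SVD idea attributed to Teoh et al. (2006): train an over-sized single-hidden-layer network, form the hidden-layer output matrix over the training data, and count its significant singular values as an estimate of how many hidden neurons actually contribute. The relative tolerance and training setup below are assumptions for illustration, not the paper's exact criterion.

```python
# Sketch of the SVD idea: count significant singular values of the
# hidden-layer output matrix H of a deliberately over-sized network.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 5))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1]            # toy target

net = MLPRegressor(hidden_layer_sizes=(30,), activation="tanh",
                   max_iter=3000, random_state=1).fit(X, y)

# Hidden-layer output matrix H (n_samples x n_hidden) from the trained weights.
W1, b1 = net.coefs_[0], net.intercepts_[0]
H = np.tanh(X @ W1 + b1)

s = np.linalg.svd(H, compute_uv=False)
effective = int(np.sum(s > 0.01 * s[0]))       # assumed relative tolerance
print(f"singular values kept: {effective} of {len(s)} hidden neurons")
```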
“…The number of neurons in the hidden layer is chosen to be the number of input neurons. Optimum selection of the number of hidden layer neurons is explored in [33].…”
Section: Proposed Back-Propagation Artificial Neural Network Model
Mentioning confidence: 99%