2008
DOI: 10.1016/j.neucom.2008.04.014
Upper bound on pattern storage in feedforward networks

Cited by 8 publications
(6 citation statements)
References 22 publications
“…To evaluate the performance of the two classifiers (SLFN, ELM) using the scaled training data, five statistical indices (RMSE, CO, PE, RSQ, and R²) were calculated for different numbers of hidden nodes for Scenario 1 (note: result figures are not shown here). The SLFN's learning capacity is governed by Ñ ≤ N ≤ (1 + n/m)Ñ [55], so the lower bound on the number of hidden nodes is N/(1 + n/m) = 28/(1 + 3/1) = 7 for this experiment. This indicates that an SLFN with nine hidden nodes can be used for prediction because its training RMSE is small.…”
Section: Comparison of ELM and SLFN
confidence: 99%
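The hidden-node lower bound quoted above can be sketched in a few lines. The symbols follow the excerpt (N training samples, n inputs, m outputs); the helper name and the second example are ours, not from the cited work:

```python
import math

def min_hidden_nodes(num_samples: int, num_inputs: int, num_outputs: int) -> int:
    """Lower bound on hidden nodes from N_tilde >= N / (1 + n/m),
    rounded up to the next whole node."""
    return math.ceil(num_samples / (1 + num_inputs / num_outputs))

# Worked example from the excerpt: N = 28 samples, n = 3 inputs, m = 1 output.
print(min_hidden_nodes(28, 3, 1))  # -> 7
```

With the excerpt's values this reproduces the stated bound of 7, which is why nine hidden nodes (above the bound) suffices in that experiment.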
“…In this context, two of the four scenarios include only the detected known teleconnection patterns at the monthly and seasonal scales; the other two scenarios contain both known and unknown teleconnection patterns at the monthly and seasonal scales. Note that a neural-network capacity test associated with the hidden layers was applied to check for potential overfitting of the proposed forecasting models (Narasimha et al., 2008; Sheela and Deepa, 2013; Wang et al., 2014).…”
Section: Mapping the Teleconnection Patterns or Index Regions, Pixel-Wise
confidence: 99%
“…, NRL(i)} in (17) are the links with the repeated weight A_iq for the qth MF of x_i. Theorem 2: The gradient components of the F-CONFIS in the premise part should be summed NRL(i) times for fuzzy variable x_i.…”
Section: B. Special Learning Algorithm for the Fuzzy Neural Network via a Fully Connected Neural Fuzzy Inference System
confidence: 99%
“…Furthermore, it has been shown in [17] that an upper bound on the capacity of feed-forward networks with arbitrary hidden-unit activation functions can be found. For a feed-forward network (without bias) with N inputs, M outputs, L hidden units, and arbitrary hidden activation functions, the number of patterns P that can be memorized with no error satisfies P ≤ L(N + M)/M.…”
Section: Capacity of Fuzzy Neural Network or Fully Connected Neural Fuzzy Inference System
confidence: 99%
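The storage bound P ≤ L(N + M)/M described in this excerpt can also be sketched directly. The function name and example values below are illustrative, not taken from the cited work:

```python
def max_patterns(num_inputs: int, num_outputs: int, num_hidden: int) -> float:
    """Upper bound on patterns memorized with zero error by a bias-free
    feed-forward network: P <= L * (N + M) / M."""
    return num_hidden * (num_inputs + num_outputs) / num_outputs

# Example: L = 7 hidden units, N = 3 inputs, M = 1 output.
print(max_patterns(3, 1, 7))  # -> 28.0
```

Note that with L = 7, N = 3, M = 1 the bound is 28 patterns, which is consistent with the first citation statement's lower bound of 7 hidden nodes for 28 training samples: the two inequalities are the same relation read in opposite directions.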