2008 IEEE International Joint Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
DOI: 10.1109/ijcnn.2008.4634332
A method to resolve the overfitting problem in recurrent neural networks for prediction of complex systems’ behavior

Abstract: In this paper, a new method to resolve the overfitting problem in predicting complex systems' behavior is proposed. This problem occurs when a neural network loses its ability to generalize. The method is based on training recurrent neural networks and using simulated annealing to optimize their generalization. The main work builds on the idea of ensemble neural networks. Finally, the results of applying this method to two sample datasets are presented, and the effectiveness of this method…
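The abstract mentions simulated annealing as the optimizer for generalization. A generic simulated-annealing loop can be sketched as follows; the toy cost function, neighbor move, and geometric cooling schedule here are illustrative assumptions, not the paper's actual procedure.

```python
import math
import random

# Generic simulated-annealing loop (sketch). The objective, step size,
# and cooling schedule are illustrative, not taken from the paper.

random.seed(0)

def cost(x):
    return (x - 2.0) ** 2          # toy objective with minimum at x = 2

def anneal(x0, t0=1.0, cooling=0.95, steps=500):
    x, t = x0, t0
    for _ in range(steps):
        candidate = x + random.uniform(-0.5, 0.5)   # neighbor move
        delta = cost(candidate) - cost(x)
        # accept improvements always; accept worse moves with
        # probability exp(-delta / t), which shrinks as t cools
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
        t *= cooling               # geometric cooling schedule
    return x

x_best = anneal(x0=10.0)
```

In the paper's setting the "state" would be a network's weights and the cost a validation-based generalization measure rather than this scalar toy objective.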

Cited by 12 publications (5 citation statements)
References 8 publications
“…In this case, a copy of the information flowing from input to output is diverted back into the hidden layers. The ENN was designed for voice-processing problems (Li et al., 2019) and is similar to the FFNN except for the addition of the context layer (Tampelini, Boscarioli, Peres, & Sampaio, 2011), which stores a copy of the information to be provided to the hidden layers in the subsequent calculation steps (Mahdaviani, Mazyar, Majidi, & Saraee, 2008). Each hidden layer has its own context layer, with the number of nodes equal to the number of nodes in the corresponding hidden layer.…”
Section: Fig 1 Morecambe Bay Model Domain and Bathymetry With Observa...
Mentioning confidence: 99%
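The Elman context-layer mechanism described in the excerpt above can be sketched in a few lines; the weight names and sizes are illustrative assumptions, not taken from any of the cited works.

```python
import numpy as np

# Minimal sketch of an Elman-style recurrent step: one hidden layer of
# size H paired with a context layer of the same size, which stores a
# copy of the hidden activations for the next step. All weight names
# (W_in, W_ctx, W_out) and sizes are illustrative.

rng = np.random.default_rng(0)
I, H, O = 3, 5, 2                       # input, hidden/context, output sizes
W_in  = rng.normal(size=(H, I)) * 0.1
W_ctx = rng.normal(size=(H, H)) * 0.1   # context -> hidden weights
W_out = rng.normal(size=(O, H)) * 0.1

def elman_step(x, context):
    """One forward step: the hidden layer sees the input plus the stored
    context, then the context takes a copy of the new activations."""
    h = np.tanh(W_in @ x + W_ctx @ context)
    y = W_out @ h
    return y, h.copy()                  # the copy becomes the next context

context = np.zeros(H)                   # context starts empty
for x in rng.normal(size=(4, I)):       # a short input sequence
    y, context = elman_step(x, context)
```

With one context layer per hidden layer, a deeper network would simply repeat this pairing at each hidden layer.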
“…The results suggest that smaller neural structures outperform complex ones on the fast-fading-channel equalization problem. These results may be explained by the overfitting problem in bigger structures [13,14], which have more parameters than necessary. Fitting the parameters to track a rapidly varying signal is difficult in bigger structures such as the DFE-EKF or DFE-UKF.…”
Section: Performance Evaluation
Mentioning confidence: 99%
“…The network is trained (with possible overfitting) and processed afterwards. Such techniques include pruning, weight sharing, weight decay, ensemble neural networks, and complexity regularization [35,37,38]. Pruning is the process of eliminating nodes and connections from the trained network.…”
Section: R. Andonie
Mentioning confidence: 99%
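Of the post-training techniques listed above, pruning is the simplest to illustrate. Below is a hypothetical sketch of magnitude-based pruning on a trained weight matrix; the threshold value is illustrative and not drawn from the cited works.

```python
import numpy as np

# Hypothetical sketch of magnitude-based pruning: connections whose
# absolute weight falls below a threshold are eliminated (set to zero).
# The matrix and threshold here are illustrative.

rng = np.random.default_rng(1)
W = rng.normal(size=(4, 6))             # weights of a "trained" layer

def prune_by_magnitude(weights, threshold):
    """Zero out connections with |w| < threshold; return the pruned
    matrix and the fraction of connections removed."""
    mask = np.abs(weights) >= threshold
    pruned = weights * mask
    removed = 1.0 - mask.mean()
    return pruned, removed

W_pruned, frac = prune_by_magnitude(W, threshold=0.5)
```

Weight decay, by contrast, acts during training by adding a penalty proportional to the weights' magnitude to the loss, shrinking unneeded connections toward zero rather than cutting them afterwards.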