[1993] Proceedings of the Second International Forum on Applications of Neural Networks to Power Systems
DOI: 10.1109/ann.1993.264286
Short-term load forecasting using diagonal recurrent neural network

Cited by 21 publications (4 citation statements) | References 11 publications
“…The most significant aspects of the model fall into the following two areas: the training process and the selection of the input variables. At the same time, Lee et al [11] presented a diagonal recurrent ANN with an adaptive learning rate. In 1993, Chaudhary et al [4] presented a fast and accurate method of STLF using combinations of self-organising maps (SOM) and a multi-layer perceptron model.…”
Section: Modification Of W Rmentioning
confidence: 98%
“…This RNN seems difficult to use for long-horizon forecasts because of stability problems. In [23,24] each neuron is fed back only to itself. A comparison between a feed-forward technique and this recurrent network is also made [24].…”
Section: Single Multi-input Single-output Models (Miso)mentioning
confidence: 99%
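The statement above describes the defining feature of a diagonal recurrent network: each hidden neuron receives feedback only from its own previous activation, so the recurrent weight matrix reduces to a vector. A minimal sketch of that forward pass (all weight names, shapes, and the tanh activation are illustrative assumptions, not details taken from the cited paper):

```python
import numpy as np

# Diagonal recurrent layer sketch: self-feedback only, so the
# recurrent weights are a vector w_rec rather than a full matrix.
rng = np.random.default_rng(0)
n_in, n_hidden = 3, 5
W_in = rng.normal(scale=0.5, size=(n_hidden, n_in))   # input-to-hidden weights
w_rec = rng.normal(scale=0.5, size=n_hidden)          # diagonal (per-neuron) feedback
w_out = rng.normal(scale=0.5, size=n_hidden)          # hidden-to-output weights

def forward(xs):
    """Run the diagonal RNN over a sequence of input vectors,
    returning one scalar forecast per time step."""
    h = np.zeros(n_hidden)
    outputs = []
    for x in xs:
        # w_rec * h is elementwise: neuron i sees only its own past state,
        # unlike a fully recurrent layer where it would see W_rec @ h.
        h = np.tanh(W_in @ x + w_rec * h)
        outputs.append(w_out @ h)
    return outputs

xs = [rng.normal(size=n_in) for _ in range(4)]
print(forward(xs))
```

The elementwise self-feedback keeps the parameter count linear in the number of hidden neurons, which is one reason such models train faster than fully recurrent ones.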
“…Recurrent neural networks have been widely used in forecasting in recent years [5] and handle non-stationary time-series prediction well, providing satisfactory results [6]. In RNN modeling, all available indicators can be used as inputs of the RNN, but irrelevant or correlated features can degrade the generalization performance of the RNN due to the "curse of dimensionality" problem [7].…”
Section: Introductionmentioning
confidence: 99%