2022
DOI: 10.20944/preprints202209.0404.v1
Preprint

Forecasting Energy Consumption Time Series Using Recurrent Neural Network in Tensorflow

Abstract: The environmental issues we are currently facing require long-term prospective efforts for sustainable growth. Renewable energy sources seem to be one of the most practical and efficient alternatives in this regard. Understanding a nation's pattern of energy use and renewable energy production is crucial for developing strategic plans. No previous study has been performed to explore the dynamics of power consumption with the change in renewable energy production on a country-wide scale. In contrast, a number o…


Cited by 17 publications (18 citation statements)
References 44 publications
“…In terms of the selected performance indices, DNN models outperformed the SRC method in simulating the daily SSC. These findings corroborate researchers' claims, such as Cigizoglu and Kisi, 2005, about using the feed-forward back propagation (FFBP) algorithm, sediment concentration, and streamflow data as inputs for daily or monthly SSC estimation and forecasting [40][41][42]. However, it has been reported that not enough studies have been conducted to evaluate the best performance of training algorithms within DNN modeling techniques [15].…”
Section: Introduction (supporting)
confidence: 83%
“…Because of the decaying error backflow, recurrent backpropagation requires a significant amount of computational time and effort to learn to store long-term information. Hence, the concept of the vanishing gradient problem in recognizing long-term dependency of Recurrent Neural Network (RNN) was introduced [42,46]. The main element of processing and retaining long-term information is LSTM feedback connections, and this characteristic distinguishes it from the conventional feedforward neural network.…”
Section: Long Short-Term Memory (LSTM) Recurrent Neural (mentioning)
confidence: 99%
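The decaying error backflow this statement describes can be illustrated numerically: backpropagating through many tanh recurrence steps multiplies the gradient by per-step derivatives bounded below 1, so the signal shrinks geometrically. The following is a minimal NumPy sketch of that effect (not the cited paper's code; the recurrence weight 0.5 and initial state 0.8 are arbitrary illustrative choices):

```python
import numpy as np

def tanh_chain_gradient(steps, w=0.5):
    """Gradient of h_steps w.r.t. h_0 for the scalar recurrence h_t = tanh(w * h_{t-1})."""
    h, grad = 0.8, 1.0
    for _ in range(steps):
        pre = w * h
        h = np.tanh(pre)
        # Chain rule: each step contributes a factor w * tanh'(pre), which is at most w.
        grad *= w * (1.0 - np.tanh(pre) ** 2)
    return grad

g_short = tanh_chain_gradient(5)   # gradient after 5 steps
g_long = tanh_chain_gradient(50)   # gradient after 50 steps: bounded by 0.5**50, effectively zero
print(g_short, g_long)
```

With each factor bounded by 0.5, the 50-step gradient is smaller than 0.5^50 (about 1e-15), which is the vanishing-gradient behavior that motivates the gated feedback connections of LSTM.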
“…The initial model that was trained on historical data was a multiple linear regression and the second was a long short-term memory network. Both multiple linear regression and long short-term memory have been used by previous researchers as data driven models to predict streamflow, water table depth, and urban flooding; however, there is a need for further investigation on how these techniques can be used to optimize RTC systems for an individual SCM using rainfall data, which is commonly available [36][37][38][39][40][41][42][43]. An exploratory data analysis approach was used to analyze the four years of historical data at the research site to develop a program to optimize basin performance.…”
Section: (mentioning)
confidence: 99%
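The multiple-linear-regression baseline this statement mentions can be sketched with ordinary least squares. The example below is a hypothetical illustration in plain NumPy, not the citing paper's model: the two features stand in for rainfall-derived predictors, and the target is a noiseless linear combination so the fit recovers the true coefficients:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.random((100, 2))                    # hypothetical predictors, e.g. rainfall depth and duration
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + 0.5    # synthetic linear target (slopes 3.0, 1.5; intercept 0.5)

# Append an intercept column and solve the least-squares problem.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.round(coef, 3))
```

On real hydrologic data such a linear baseline would then be compared against an LSTM, which can additionally exploit the temporal ordering of the rainfall record.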
“…Because of the decaying error backflow, recurrent backpropagation requires a significant amount of computational time and effort to learn to store long-term information. Hence, the concept of the vanishing gradient problem in recognizing long-term dependency of Recurrent Neural Network (RNN) was introduced [49,53]. The main element of processing and retaining long-term information is LSTM feedback connections, and this characteristic distinguishes it from the conventional feedforward neural network.…”
Section: Long Short-Term Memory (LSTM) Recurrent Neural (mentioning)
confidence: 99%