2020 5th International Conference on Green Technology and Sustainable Development (GTSD)
DOI: 10.1109/gtsd50082.2020.9303164
Electricity Demand Forecasting for Smart Grid Based on Deep Learning Approach

Cited by 8 publications (2 citation statements) · References 13 publications
“…The study conducted has revealed that the most widely used Deep Learning models in the energy domain for demand forecasting purposes are CNNs, RNNs, LSTMs, DQNs, and CRBMs, either as a variation of any of them, a combination of two or more of them, or a combination of any of them with other techniques. Notable are CNN and its variations such as Pyramid-CNN [82, 85, 88, 90, 91, 94, 95, 101, 106, 107, 109, 115, 118, 119, 123], LSTM and its variations such as B-LSTM [80, 82, 86, 87, 88, 91, 93, 94, 95, 99, 100, 103, 104, 106, 107, 109, 110, 111, 112, 113, 118, 119, 122], and a combination of both [82, 88, 91, 94, 95, 106, 107, 109, 118, 119]. Real testbeds with high-quality data are not common, but are necessary to determine the performance of Deep Learning models.…”
Section: Discussion (mentioning)
confidence: 99%
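As a concrete illustration of the CNN-LSTM hybrids this citation statement refers to, the sketch below chains a 1-D convolutional front end with an LSTM in PyTorch. This is a minimal sketch, not the architecture of the cited paper: the class name, layer sizes, and the 24-hour input window are illustrative assumptions.

```python
import torch
import torch.nn as nn

class CNNLSTMForecaster(nn.Module):
    """Hybrid CNN-LSTM: 1-D convolutions extract local load patterns,
    an LSTM models the temporal dependence, a linear head predicts demand."""
    def __init__(self, n_features=1, conv_channels=16, hidden_size=32, horizon=1):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(n_features, conv_channels, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.lstm = nn.LSTM(conv_channels, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x):
        # x: (batch, seq_len, n_features); Conv1d expects (batch, channels, seq_len)
        z = self.conv(x.transpose(1, 2)).transpose(1, 2)
        out, _ = self.lstm(z)            # out: (batch, seq_len, hidden_size)
        return self.head(out[:, -1, :])  # forecast from the last time step

# Example: forecast the next hour of demand from a 24-hour window (placeholder data)
model = CNNLSTMForecaster()
window = torch.randn(8, 24, 1)           # (batch, seq_len, n_features)
print(model(window).shape)                # torch.Size([8, 1])
```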
“…An RNN takes into account both the present input and the output obtained from the previous input when making a decision. Unlike a feedforward neural network, an RNN can use its internal state (memory) to evaluate input variables [31,32]. In a recurrent neural network, all of the inputs are related to one another, which distinguishes it from other neural networks.…”
Section: Gated Recurrent Unit (GRU) (mentioning)
confidence: 99%
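To illustrate the internal state this excerpt describes, the following minimal PyTorch sketch wraps nn.GRU so that the hidden state returned from one window can seed the next call, making each prediction depend on everything seen so far. The class name, layer sizes, and window length are assumptions for illustration, not details taken from the cited work.

```python
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    """The GRU carries a hidden state (its 'memory') across time steps, so each
    prediction depends on the current input and on the history seen so far,
    unlike a feedforward network, which sees each input in isolation."""
    def __init__(self, n_features=1, hidden_size=32, horizon=1):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, horizon)

    def forward(self, x, h0=None):
        out, h_n = self.gru(x, h0)        # h_n: final hidden state, (1, batch, hidden_size)
        return self.head(out[:, -1, :]), h_n

model = GRUForecaster()
x = torch.randn(4, 24, 1)                 # 24 past load readings per sample (placeholder data)
y_hat, state = model(x)
# The returned state can seed the next call, so the memory persists across windows.
y_next, _ = model(torch.randn(4, 24, 1), state)
```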