2019
DOI: 10.3390/en12173308
Short-Term Load Forecasting for CCHP Systems Considering the Correlation between Heating, Gas and Electrical Loads Based on Deep Learning

Abstract: A combined cooling, heating, and power (CCHP) system is a distributed energy system that uses a power station or heat engine to generate electricity and useful heat simultaneously. Owing to its wide range of advantages, including efficiency, ecological, and financial benefits, CCHP will be a main direction for integrated energy systems. Accurate prediction of heating, gas, and electrical loads plays an essential role in energy management of CCHP systems. This paper combines a long short-term memory (LSTM) network and …

Cited by 36 publications (15 citation statements) · References 45 publications
“…To reduce overfitting, ensemble learning was employed to create independent predictions of multiple models and to use weighted averaged results [59,[95][96][97]99,113,115,[129][130][131][132][133][134]. Other measures against overfitting include the use of incremental learning and dynamic neural networks, where the models are updated step by step during the training phase [88,106,131,135], restrictions on coefficients [136], and the introduction of dropout layers [137]. The adjustment of the coefficients of the predicting variables in order to capture the essential properties of the training data and provide better generalization to yet unknown data points is an important and widespread concept for avoiding overfitting, known as regularization.…”
Section: Measures for Improvement of Accuracy (mentioning; confidence: 99%)
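Among the anti-overfitting measures listed above, dropout is the simplest to illustrate. The sketch below is a minimal, hypothetical implementation of inverted dropout in plain NumPy (the function name and shapes are illustrative, not taken from the cited papers): during training a random fraction of activations is zeroed and the survivors are rescaled, so the expected activation at inference time is unchanged.

```python
import numpy as np

def dropout(x, rate, rng, train=True):
    """Inverted dropout: zero a fraction `rate` of activations during
    training and rescale the survivors by 1/keep, so the expected value
    of each unit matches its inference-time (no-dropout) value."""
    if not train or rate == 0.0:
        return x
    keep = 1.0 - rate
    mask = rng.random(x.shape) < keep   # True for units that survive
    return x * mask / keep

rng = np.random.default_rng(0)
h = np.ones((4, 8))                         # a toy batch of activations
h_train = dropout(h, rate=0.5, rng=rng)     # entries are 0.0 or 2.0
h_eval = dropout(h, rate=0.5, rng=rng, train=False)  # identity at inference
```

At evaluation time the layer is a no-op, which is why the rescaling happens during training rather than at inference.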
“…LSTM and GRU are techniques built upon RNNs. LSTM was the most used ANN approach in load forecasting, demonstrating better performance in complex situations such as low/high-frequency components [53], probabilistic load forecasting [55], PV generation [66], CCHP (combined cooling, heating, and power) systems [81], hybrid energy systems [87], etc. A combination of three ANNs, namely LSTM, BPNN, and DQN (Deep Q-Network), has been used to establish a similar-day-selection-based STLF model [40].…”
Section: B. Different ANN Techniques in Deep Learning Based Load Forecasting (mentioning; confidence: 99%)
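To make the LSTM mechanism referenced above concrete, here is a minimal NumPy sketch of a single LSTM time step, not the architecture of any cited paper: the input, forget, and output gates plus the candidate cell state are computed from stacked weight matrices (all names, sizes, and the toy sequence are illustrative assumptions).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step. Rows of W, U, b hold, in order, the input gate i,
    forget gate f, output gate o, and candidate cell state g."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # pre-activations, shape (4n,)
    i = sigmoid(z[0:n])                 # how much new information to admit
    f = sigmoid(z[n:2 * n])             # how much old cell state to keep
    o = sigmoid(z[2 * n:3 * n])         # how much cell state to expose
    g = np.tanh(z[3 * n:4 * n])         # candidate cell state
    c = f * c_prev + i * g              # long-term memory update
    h = o * np.tanh(c)                  # short-term hidden output
    return h, c

rng = np.random.default_rng(1)
n_in, n_hid = 3, 5
W = 0.1 * rng.standard_normal((4 * n_hid, n_in))
U = 0.1 * rng.standard_normal((4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.standard_normal((10, n_in)):  # a toy 10-step load sequence
    h, c = lstm_step(x, h, c, W, U, b)
```

The additive cell-state update `c = f * c_prev + i * g` is what lets gradients flow over long horizons, which is why LSTMs handle the multi-day dependencies typical of load data better than plain RNNs.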
“…A CNN combined with an attention block is used to extract the effective characteristics of load-influencing factors to forecast the next hour's load. [25] uses the Pearson correlation coefficient to measure the time correlation between the current load and the historical load, analyzes the coupling relationship between the heating, gas, and electric loads, and forecasts the electric, cooling, and heating loads of the CCHP system. Another method that also plays a positive role is MTL (multi-task learning) based forecasting, which processes information from different types of energy loads through MTL's sharing mechanism [26] and achieves more effective forecasting results than single-task learning.…”
Section: Introduction (mentioning; confidence: 99%)
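The Pearson-correlation analysis described above, measuring how strongly two load series move together, can be sketched in a few lines of NumPy. The series below are synthetic toy data (a shared daily cycle versus pure noise), not data from the cited study.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation coefficient between two load series."""
    a = np.asarray(a, dtype=float) - np.mean(a)
    b = np.asarray(b, dtype=float) - np.mean(b)
    return float(a @ b / np.sqrt((a @ a) * (b @ b)))

t = np.arange(168)                                 # one week, hourly samples
electric = 50 + 10 * np.sin(2 * np.pi * t / 24)    # toy daily demand cycle
heating = 30 + 6 * np.sin(2 * np.pi * t / 24)      # coupled load, same cycle
noise = 30 + np.random.default_rng(2).standard_normal(168)  # uncoupled series

r_coupled = pearson(electric, heating)   # close to 1: strong coupling
r_noise = pearson(electric, noise)       # near 0: no coupling
```

A high coefficient between, say, heating and electric load justifies feeding both series into one forecasting model, which is exactly the coupling argument the citation makes.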