2020
DOI: 10.3390/e22010068
Towards Modified Entropy Mutual Information Feature Selection to Forecast Medium-Term Load Using a Deep Learning Model in Smart Homes

Abstract: Over the last decades, load forecasting has been used by power companies to balance energy demand and supply. Among the several load forecasting horizons, medium-term load forecasting is necessary for a grid's maintenance planning, the setting of electricity prices, and harmonizing energy sharing arrangements. Forecasting the month-ahead electrical load provides the information required for the interchange of energy among power companies. For accurate load forecasting, this paper proposes a model for medium-term lo…
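The title's mutual-information-based feature selection can be illustrated with a minimal histogram estimator of I(X;Y): features are ranked by how much information they carry about the target load. This is a generic sketch, not the paper's modified-entropy method; the feature names (`temperature`, `noise`) and the toy data are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X;Y) in nats for two 1-D arrays."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)       # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)       # marginal p(y), shape (1, bins)
    nz = pxy > 0                              # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Toy load-forecasting setup: temperature drives load, "noise" is irrelevant.
n = 2000
temperature = rng.normal(20, 5, n)
noise = rng.normal(0, 1, n)
load = 2.0 * temperature + rng.normal(0, 1, n)

scores = {"temperature": mutual_information(temperature, load),
          "noise": mutual_information(noise, load)}
best = max(scores, key=scores.get)
print(best, scores)
```

A real selector would compute such scores for every candidate feature and keep the top-k before training the forecasting model.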

Cited by 32 publications
(14 citation statements)
References 45 publications
“…Deep learning techniques can effectively accomplish the tasks of image detection, recognition, and classification, so the introduction of deep learning techniques in the field of imaging may help radiologists to complete various tasks of detection and diagnosis [19,20]. Pulmonary nodule detection using an AI algorithm is an important part of the AI medical field [21].…”
Section: Discussion
confidence: 99%
“…In a few cases, MI-ANN, WNN, GRU, DBN, RBM, ANFIS, and ART network approaches were applied to obtain better performances, e.g., cascade NN and KNN-ANN [44], [48], [63], [65], [66], [73], [74], [75], [96], [109], [124], [129], [131], [133], [134], [137], …”
Section: B. Different ANN Techniques in Deep-Learning-Based Load Forecasting
confidence: 99%
“…There is a problem of weak generalization that occurs when the number of training samples is small and the number of neurons is large, which may cause overfitting or over-parameterization. To solve this problem, the fully connected layer is trained by a conditional restricted Boltzmann machine (CRBM) [62]. Hence, the deep CNN architecture is described as follows:…”
Section: F. Training of Deep Convolutional Neural Network
confidence: 99%
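The CRBM pre-training cited above builds on the basic restricted Boltzmann machine. As a rough illustration of the underlying idea, here is a minimal plain RBM trained with one-step contrastive divergence (CD-1) in NumPy; this is a simplified stand-in for the cited CRBM, and the layer sizes, learning rate, and toy data are all made up for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny RBM: 6 binary visible units, 4 hidden units.
n_visible, n_hidden = 6, 4
W = rng.normal(0, 0.1, (n_visible, n_hidden))
b_v = np.zeros(n_visible)   # visible biases
b_h = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, lr=0.1):
    """One CD-1 parameter update on a batch of binary visible vectors."""
    global W, b_v, b_h
    # Positive phase: hidden activation probabilities given the data.
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)  # sample hidden states
    # Negative phase: one Gibbs step back to visible, then to hidden again.
    pv1 = sigmoid(h0 @ W.T + b_v)
    ph1 = sigmoid(pv1 @ W + b_h)
    # Gradient approximation: <v h>_data - <v h>_model.
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b_v += lr * (v0 - pv1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)
    return float(np.mean((v0 - pv1) ** 2))  # reconstruction error

# Toy data: a few repeated binary patterns.
data = np.array([[1, 1, 0, 0, 1, 0],
                 [1, 0, 0, 0, 1, 0],
                 [0, 1, 1, 1, 0, 1]] * 10, dtype=float)

errs = [cd1_step(data) for _ in range(200)]
print(f"reconstruction error: {errs[0]:.3f} -> {errs[-1]:.3f}")
```

In the cited scheme the idea is analogous: the Boltzmann-machine pre-training gives the fully connected layer a data-driven initialization, which mitigates overfitting when training samples are scarce relative to the number of neurons.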