2022
DOI: 10.3390/en16010276

Enhanced Machine-Learning Techniques for Medium-Term and Short-Term Electric-Load Forecasting in Smart Grids

Abstract: Electric-load forecasting through data-analytic approaches has become one of the most active and emerging research areas, as it provides future consumption patterns of the electric load. Because both electricity production and use fluctuate widely, balancing electric load and demand is a difficult task. By analyzing past electric-consumption records to estimate the upcoming electricity load, this fluctuating behavior can be accommodated. In this study, a framework for …

Cited by 16 publications (7 citation statements) | References 47 publications
“…Architectural modifications are made by adding other feature-selection methods at the RFE stage and recalculating the ranking. The feature-selection methods used in RFE modifications include RF-RFE [18], [19], [22]; XGB-RFE [19]; SVM-RFE [18], [20], [21], [23]; GBM-RFE [18]; Absolute Cosine [23]; KPCA [21]; and MI [20]. To combine these ranking criteria, one can add weights with a threshold [19]; add weights directly [18], [22]; add weights based on multiplying weight and accuracy [18]; sum with the mRMR method [23]; take the weighted average [21]; or recalculate the smallest features and input features using the MICBC approach [20].…”
Section: RQ 5: Modification and Development Strategy Architecture Rec… (mentioning)
confidence: 99%
“…The feature-selection methods used in RFE modifications include RF-RFE [18], [19], [22]; XGB-RFE [19]; SVM-RFE [18], [20], [21], [23]; GBM-RFE [18]; Absolute Cosine [23]; KPCA [21]; and MI [20]. To combine these ranking criteria, one can add weights with a threshold [19]; add weights directly [18], [22]; add weights based on multiplying weight and accuracy [18]; sum with the mRMR method [23]; take the weighted average [21]; or recalculate the smallest features and input features using the MICBC approach [20]. This suggests that other ways of combining feature rankings are possible, such as merit values based on correlation, averages based on maximum weight, and so on.…”
Section: RQ 5: Modification and Development Strategy Architecture Rec… (mentioning)
confidence: 99%
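The rank-combination strategies listed in the snippet above (weighted sums and weighted averages of per-method rankings) reduce to simple arithmetic over ranking tables. A minimal pure-Python sketch, with hypothetical feature names and rankings invented for illustration (lower rank = more important), combining two RFE-style rankings by weighted average:

```python
# Hypothetical rankings from two RFE runs (e.g., RF-RFE and SVM-RFE);
# the feature names and rank values here are invented for illustration.
rf_rfe_rank = {"load_lag24": 1, "temp": 2, "humidity": 4, "hour": 3}
svm_rfe_rank = {"load_lag24": 2, "temp": 1, "humidity": 3, "hour": 4}

def combine_rankings(rankings, weights):
    """Weighted-average rank across several feature-selection methods."""
    features = rankings[0]
    return {
        f: sum(w * r[f] for w, r in zip(weights, rankings))
        for f in features
    }

combined = combine_rankings([rf_rfe_rank, svm_rfe_rank], weights=[0.6, 0.4])
# Threshold step: keep features whose combined rank is small enough.
selected = sorted(f for f, score in combined.items() if score <= 2.5)
```

Swapping the weighted average for a thresholded sum, or for an mRMR-style score, only changes the scoring line; the surrounding combination loop stays the same.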
“…Khan et al. [15] presented enhanced machine-learning models based on SVM and CNN to support electricity forecasting in smart grids, using a big dataset from ISO-NE. A feature-selection and feature-extraction scheme was implemented to reduce the computational complexity.…”
Section: Literature Review and Research Gap Analysis (mentioning)
confidence: 99%
“…Haydar et al. [6] proposed a framework for energy-load anticipation that involved attribute selection, extraction, and regression. XGB (Extreme Gradient Boosting) and RF (Random Forest) were used in the attribute-selection phase to ascertain the value of each attribute.…”
Section: Introduction (mentioning)
confidence: 99%
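The "ascertain the value of each attribute" step that the cited framework delegates to XGB and RF can be illustrated in spirit with permutation importance: shuffle one feature column and measure how much the prediction error grows. This is a pure-Python sketch on invented toy data with a stand-in predictor, not the paper's actual models:

```python
import random

random.seed(0)

# Hypothetical toy records: features are (temperature, hour_of_day);
# the load value here depends on temperature only, by construction.
records = [((float(t), float(random.randint(0, 23))), 2.0 * t + 5.0)
           for t in range(24)]

def predict(x):
    """Stand-in model: load depends only on the temperature feature."""
    temp, _hour = x
    return 2.0 * temp + 5.0

def mse(recs):
    return sum((predict(x) - y) ** 2 for x, y in recs) / len(recs)

def permutation_importance(recs, idx):
    """Importance = error increase after shuffling feature column `idx`."""
    base = mse(recs)
    col = [x[idx] for x, _ in recs]
    random.shuffle(col)
    permuted = []
    for (x, y), v in zip(recs, col):
        x2 = list(x)
        x2[idx] = v
        permuted.append((tuple(x2), y))
    return mse(permuted) - base

imp_temp = permutation_importance(records, 0)  # informative feature
imp_hour = permutation_importance(records, 1)  # ignored by the model
```

Because the stand-in predictor ignores the hour feature entirely, shuffling it leaves the error unchanged, while shuffling temperature degrades it; tree-ensemble importances (XGB, RF) rank attributes by the analogous effect inside the fitted trees.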