2019
DOI: 10.1007/978-3-030-31362-3_21

The Comparison of Machine-Learning Methods XGBoost and LightGBM to Predict Energy Development

Cited by 12 publications (6 citation statements) · References 8 publications
“…LightGBM [18] also belongs to the family of gradient tree boosting models; its decision trees are grown leaf-wise, splitting the leaf with the best fit, which reduces the loss and improves accuracy. Similarly, XGBoost and LightGBM models have been used to predict thermal power energy development [19] and achieved a lower mean absolute percentage error (MAPE) on that dataset.…”
Section: Related Work
confidence: 99%
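The comparison in [19] is reported in terms of mean absolute percentage error (MAPE). The cited paper's evaluation code is not shown here; as a minimal sketch of the metric itself (the function name and the sample values below are illustrative, not from the paper):

```python
def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent.

    Assumes no true value is zero (division by y_true).
    """
    errors = [abs((t - p) / t) for t, p in zip(y_true, y_pred)]
    return 100.0 * sum(errors) / len(errors)

# Illustrative values only: relative errors 10%, 5%, 5% -> MAPE ~ 6.67%
actual = [100.0, 200.0, 400.0]
predicted = [110.0, 190.0, 420.0]
print(round(mape(actual, predicted), 2))  # → 6.67
```

A lower MAPE means the model's predictions deviate less, in relative terms, from the observed values, which is why it is a common yardstick for comparing XGBoost and LightGBM on the same dataset.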
“…This combination leads to nonlinearity. Neural networks are known to be capable of modeling such nonlinearity, and their successful implementation has been demonstrated in many cases and described in many publications [5][6][7].…”
Section: Introduction
confidence: 72%
“…Besides, the RF model may under- or overestimate extreme values of ambient benzene (Xue et al., 2019), which can be offset by the XGBoost algorithm through its boosting method (Li et al., 2020). In the XGBoost algorithm, excessive leaf nodes often show low splitting gain, a shortcoming the LightGBM model can compensate for (Nemeth et al., 2019). Overall, combining these decision tree models overcomes the weaknesses of the individual models and enhances the robustness of the final model.…”
Section: The Model Fitting and Validation
confidence: 99%
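The statement above describes combining several tree models so that each compensates for the others' weaknesses. The cited study's exact combination scheme is not given here; a minimal sketch of the simplest variant, weighted averaging of per-sample predictions (all names and values below are illustrative), could look like:

```python
def blend(predictions, weights=None):
    """Combine per-sample predictions from several models by weighted average.

    predictions: list of equal-length prediction lists, one per model.
    weights: optional per-model weights; defaults to a uniform average.
    """
    n_models = len(predictions)
    if weights is None:
        weights = [1.0 / n_models] * n_models
    n_samples = len(predictions[0])
    return [sum(w * preds[i] for w, preds in zip(weights, predictions))
            for i in range(n_samples)]

# Hypothetical outputs of three fitted models on the same test samples
rf_pred  = [1.0, 2.0, 3.0]
xgb_pred = [1.2, 1.8, 3.4]
lgb_pred = [0.8, 2.2, 2.8]
print(blend([rf_pred, xgb_pred, lgb_pred]))
```

Uniform averaging is only one design choice; weights can also be tuned on a validation set so that models which handle extremes better (as the statement attributes to XGBoost and LightGBM) contribute more.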