2021
DOI: 10.1007/s00521-021-05995-8
Boosting algorithms in energy research: a systematic review

Cited by 41 publications (24 citation statements)
References 105 publications
“…The experiments conducted herein were solely based on random forests. Nonetheless, more regression algorithms (see, e.g., those listed and documented by Hastie et al [56] and James et al [57], as well as those investigated in other contexts in hydrometeorology by Tyralis et al [58] and Zhang and Ye [59]) are worthy of investigation together with a variety of time series features in streamflow regionalization contexts, with boosting being an appealing option among them because of its theoretical properties (see, e.g., [60], Section 3) and its relevance to determining feature importance within explainable machine learning settings. For these same reasons, boosting algorithms were previously proposed and extensively investigated by Tyralis et al [16] for predicting various hydrological signatures probabilistically.…”
Section: Discussion
confidence: 99%
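To make the feature-importance use mentioned in the statement above more concrete, the following is a minimal sketch, not taken from the reviewed paper or the citing study, of fitting a gradient boosting regressor with scikit-learn and reading its impurity-based feature importances; the synthetic data, feature names and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (hypothetical data and settings): gradient boosting as a
# regression algorithm whose fitted model also yields feature importances.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 5))                       # five candidate predictors
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=500)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                  max_depth=3, random_state=0)
model.fit(X, y)

# Impurity-based importances: one score per predictor, summing to 1.
for name, imp in zip([f"x{i}" for i in range(5)], model.feature_importances_):
    print(f"{name}: {imp:.3f}")
```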
“…Probably due to the presence of shifts (and perhaps trends) in the urban water demand time series, the linear boosting algorithm has been found to be the best‐performing quantile regression algorithm in solving probabilistic one‐day ahead urban water demand forecasting problems. Boosting algorithms are known for their ability of “garnering wisdom from a council of fools” (Tyralis & Papacharalampous, 2021), and hold their own special place among the machine and statistical learning algorithms. They have also been shown to perform well in probabilistic energy demand forecasting (see the results by Taieb et al., 2016).…”
Section: Discussion
confidence: 99%
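For context on the probabilistic (quantile-based) forecasting with boosting referred to above, here is a hedged sketch using scikit-learn's GradientBoostingRegressor with the quantile loss; the data and the chosen quantile levels are assumptions for illustration, not the setup of the cited studies.

```python
# Minimal sketch (assumed data): one boosting model per quantile level,
# giving a predictive interval rather than a single point forecast.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(400, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.3, size=400)

quantile_models = {}
for tau in (0.05, 0.50, 0.95):                      # illustrative quantile levels
    gbr = GradientBoostingRegressor(loss="quantile", alpha=tau,
                                    n_estimators=200, random_state=0)
    quantile_models[tau] = gbr.fit(X, y)

X_new = np.array([[2.5]])
interval = {tau: m.predict(X_new)[0] for tau, m in quantile_models.items()}
print(interval)                                     # lower bound, median, upper bound
```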
“…As the boosting and generalized random forest algorithms are not affected by less important predictors by construction (see their property description, e.g., in Tyralis, Papacharalampous, & Langousis, 2019, and Tyralis & Papacharalampous, 2021), we have considered the entire set of endogenous and exogenous predictors herein. Nonetheless, to reduce the computational requirements (which increase with the number of predictors), one could prefer the use of fewer predictors even when algorithms with such good properties (in terms of predictors) are selected.…”
Section: Discussion
confidence: 99%
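The property invoked in the statement above can be illustrated with a short sketch (synthetic data, arbitrary settings, not from the cited works): a boosting model fitted on the full predictor set tends to assign negligible importance to uninformative predictors, which is the rationale for keeping all endogenous and exogenous predictors.

```python
# Minimal sketch (synthetic data): append purely random predictor columns
# and check that the fitted boosting model largely ignores them.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(1)
X_informative = rng.normal(size=(600, 2))
X_noise = rng.normal(size=(600, 8))                 # less important predictors
X = np.hstack([X_informative, X_noise])
y = 3.0 * X[:, 0] + X[:, 1] ** 2 + rng.normal(scale=0.2, size=600)

model = GradientBoostingRegressor(n_estimators=300, random_state=0).fit(X, y)
print("informative predictors:", model.feature_importances_[:2].sum().round(3))
print("noise predictors:      ", model.feature_importances_[2:].sum().round(3))
```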
“…Statistical modelling based on the quantile loss function is frequently covered in the statistical literature and is well received by practitioners, e.g., in the optimisation of linear-in-parameters models (e.g., see the book by [53]), neural networks [54], random forests (e.g., see the review by [55]) and boosting algorithms (e.g., see the review by [56]). Therefore, we believe that it will also be of practical interest to hydrologists.…”
Section: Introduction
confidence: 99%
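The quantile loss referred to in this statement is commonly written as the pinball loss; the formulation below is a standard textbook notation rather than the exact notation of the cited reviews.

```latex
% Pinball (quantile) loss at level \tau \in (0, 1) for observation y and
% predicted quantile q; standard formulation, not copied from the cited works.
\[
  \rho_\tau(y, q) =
  \begin{cases}
    \tau \,(y - q),       & y \ge q, \\
    (\tau - 1)\,(y - q),  & y < q,
  \end{cases}
  \qquad
  \hat{f}_\tau = \arg\min_{f \in \mathcal{F}}
  \frac{1}{n} \sum_{i=1}^{n} \rho_\tau\bigl(y_i, f(x_i)\bigr).
\]
```

Minimising the average pinball loss over a model class (linear models, neural networks, random forests or boosting, as listed in the statement) yields an estimate of the conditional \(\tau\)-quantile rather than the conditional mean.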