2021 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn52387.2021.9533378
A study on Ensemble Learning for Time Series Forecasting and the need for Meta-Learning

Cited by 18 publications (5 citation statements) | References 17 publications
“…providing better predictions, as verified by all the reported measures. The fact that the fused method (ensemble) performs better than the individual methods is a recurring pattern in the ensemble learning literature [54]. Combining different predictors increases the confidence in the predictions and reduces the error.…”
Section: PLOS ONE
confidence: 99%
“…For a similar forecasting challenge, [46] combined four base models (random forests, long short-term memory (LSTM), deep neural networks, and evolutionary trees) with gradient boosting and extreme gradient boosting as meta-learners, resulting in a significant reduction in forecast error. In [20], the authors conducted a comparative analysis of various forecast combination strategies, including simple averaging, linear combination with performance-based weights, FFORMA, and stacking. Experimental results involving 16,000 time series from diverse sources demonstrated that stacking outperforms its counterparts, underscoring its efficacy for time series with varying characteristics.…”
Section: Combining Forecasts
confidence: 99%
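The two non-stacking strategies named in the excerpt above, simple averaging and linear combination with performance-based weights, can be sketched in a few lines of NumPy. This is a minimal illustration on synthetic data; the noise scales and inverse-MSE weighting scheme are assumptions for the example, not details from the cited experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.normal(size=100)                     # toy target series
# Three hypothetical base-model forecasts with different noise levels.
forecasts = np.stack([y_true + rng.normal(scale=s, size=100)
                      for s in (0.2, 0.5, 1.0)])  # shape (3, 100)

# Strategy 1: simple average -- equal weight for every base model.
avg = forecasts.mean(axis=0)

# Strategy 2: performance-based linear combination -- weights proportional
# to inverse in-sample MSE, normalised to sum to one.
mse = ((forecasts - y_true) ** 2).mean(axis=1)
w = (1 / mse) / (1 / mse).sum()
weighted = w @ forecasts

def rmse(pred):
    return np.sqrt(((pred - y_true) ** 2).mean())

print(f"average: {rmse(avg):.3f}  weighted: {rmse(weighted):.3f}")
```

Because the weighted scheme concentrates mass on the least-error base model, it usually beats the plain mean when base-model accuracies differ substantially, which is the pattern the excerpt describes.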
“…When dealing with constituent forecasts derived from nonlinear models or situations where the true relationship between combination members and the best forecast involves nonlinear systems, ML models can be harnessed to nonlinearly combine the base forecasts through a stacking procedure [5,20]. Stacking, in this context, can enhance forecast accuracy by learning the optimal combination of constituent forecasts in a data-driven manner.…”
Section: Combining Forecasts
confidence: 99%
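The stacking procedure described above — learning the combination of constituent forecasts in a data-driven manner — can be sketched as follows. Ordinary least squares stands in here for the nonlinear meta-learners (gradient boosting, neural networks) used in the cited works, and all data and base models are synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
y = np.sin(np.linspace(0, 8 * np.pi, n)) + rng.normal(scale=0.1, size=n)

# Hypothetical base forecasts: a biased model, a noisy model, and naive
# persistence (the previous observation).
F = np.column_stack([0.5 * y + rng.normal(scale=0.1, size=n),
                     y + rng.normal(scale=0.6, size=n),
                     np.roll(y, 1)])

train, test = slice(1, 150), slice(150, n)   # skip t=0 (np.roll wraps around)

# Meta-learner: fit combination weights (plus intercept) on the training
# window -- a linear stand-in for the nonlinear learners in the literature.
X_train = np.column_stack([F[train], np.ones(149)])
beta, *_ = np.linalg.lstsq(X_train, y[train], rcond=None)

X_test = np.column_stack([F[test], np.ones(n - 150)])
stacked = X_test @ beta
naive_avg = F[test].mean(axis=1)

def rmse(pred):
    return np.sqrt(((pred - y[test]) ** 2).mean())

print(f"simple average: {rmse(naive_avg):.3f}  stacked: {rmse(stacked):.3f}")
```

The meta-learner can rescale the biased base model and down-weight the noisy one, which a fixed equal-weight average cannot do — the data-driven advantage the excerpt attributes to stacking.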
“…Neural networks (NNs) are often used in stacking to estimate the nonlinear mapping between the target value and its forecasts produced by multiple models [11]. The power of ensemble learning for forecasting was demonstrated in [12], where several meta-learning approaches were evaluated on a large and diverse set of time series data. Ensemble methods were found to provide a benefit in overall forecasting accuracy, with simple ensemble methods leading to good results on average.…”
Section: Introduction
confidence: 99%