2016
DOI: 10.1016/j.neucom.2016.08.072
Dynamic selection of forecast combiners

Cited by 23 publications (10 citation statements)
References 49 publications
“…However, studies like Neumann's [23] comment that adopting a single model can lead to statistical bias and to underestimation of the real uncertainty underlying the time series. Accordingly, the authors of [24]–[30] have taken up the challenge of combining diverse models to produce aggregate estimates.…”
Section: Introduction
confidence: 99%
“…In this context, one of the problems has been how to combine single predictors of the time series to enhance forecasting performance. Statistically, these ensembles are generally superior to individual models in terms of both accuracy and efficiency [24], [31]–[34]. Also, among ANN-based approaches, there are ensembles that combine models to obtain a more statistically robust system, outperforming the single forecasting models [12], [29], [35], [36].…”
Section: Introduction
confidence: 99%
“…Fourth, the posterior probabilities of the best forecaster model (the model weights) are computed using the normalized exponential (Softmax) function. The Softmax function is a generalization of the logistic function that is often employed in classification and forecasting exercises using traditional machine learning and deep learning methods as a combiner or activation function [38]. The function assigns larger weights to models with smaller forecasting errors, with the weights decaying exponentially as the error grows (Table 1, lines 11–13).…”
Section: Methods
confidence: 99%
“…The Softmax function is a generalization of the logistic function that transforms a K-dimensional vector of arbitrary real values into a K-dimensional vector of real values in the range (0, 1) that add up to one. The Softmax function is often used in classification and forecasting exercises using traditional machine learning and deep learning methods as a combiner or an activation function [63]. The function assigns larger weights to models with smaller forecasting errors, with the weights decaying exponentially as the error grows.…”
Section: Methods
confidence: 99%
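The Softmax-based combination scheme described in the two Methods excerpts above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function name `softmax_weights` and the example error values are assumptions, and the scheme simply turns negated forecast errors into normalized weights so that models with smaller errors receive exponentially larger weights.

```python
import numpy as np

def softmax_weights(errors):
    """Map per-model forecast errors to combination weights.

    Smaller error -> larger weight; weights decay exponentially
    with the error and sum to one (a Softmax over negated errors).
    """
    errors = np.asarray(errors, dtype=float)
    # Negate so the smallest error yields the largest weight,
    # and subtract the max for numerical stability.
    z = -errors
    z = z - z.max()
    w = np.exp(z)
    return w / w.sum()

# Three hypothetical forecasters with errors 0.2, 0.5 and 1.0:
weights = softmax_weights([0.2, 0.5, 1.0])
# The weights sum to one, and the model with error 0.2 gets the
# largest weight; the combined forecast would be the weighted sum
# of the individual forecasts.
```

Subtracting the maximum before exponentiating does not change the resulting weights but avoids overflow when errors are large, a standard trick for Softmax computations.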