2021
DOI: 10.1002/qj.4177

A new approach to extended‐range multimodel forecasting: Sequential learning algorithms

Abstract: Multimodel combinations are a well-established methodology in weather and climate prediction and their benefits have been widely discussed in the literature. Typical approaches involve combining the output of different numerical weather prediction (NWP) models using constant weighting factors, either uniformly distributed or determined through a prior skill assessment. This strategy, however, can lead to suboptimal levels of skill, as the performance of NWP models can vary with time (e.g., seasonally varying s…
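The abstract's "typical approach" of constant weighting can be illustrated with a short sketch. This is not the paper's implementation; it is a minimal Python illustration, with assumed array shapes and names, of combining NWP model outputs using weights that are either uniform or derived from a prior skill assessment.

```python
import numpy as np

def constant_weight_combination(forecasts, past_errors=None):
    """Combine model forecasts with fixed weights.

    forecasts   : (n_models, n_leadtimes) array of model outputs (illustrative shape)
    past_errors : optional (n_models,) array of errors from a prior skill
                  assessment, e.g. RMSE over a hindcast period (assumed input)
    """
    n_models = forecasts.shape[0]
    if past_errors is None:
        weights = np.full(n_models, 1.0 / n_models)   # uniform weighting
    else:
        inv_skill = 1.0 / np.asarray(past_errors)     # lower past error -> larger weight
        weights = inv_skill / inv_skill.sum()         # normalise to a convex combination
    return weights @ forecasts                        # weighted multimodel forecast
```

Because the weights stay fixed, this baseline cannot react to the time-varying model performance the abstract points out, which is what motivates the sequential learning algorithms proposed in the paper.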

Cited by 8 publications (7 citation statements). References 66 publications.
“…Hence, past performance leads, at times, to a suboptimal model combination for forecasting purposes. This finding complements previous studies that conducted experiments in more stable market conditions [4], [42], [44], [46]. As discussed in the previous Section, a further drawback of the Combined forecasts is that weather forecasts are a shared source of errors, thus resulting in error correlation between forecasts.…”
Section: Heterogenous Conditions (supporting)
confidence: 85%
“…Sengupta et al. (2020) used a Bayesian neural network to infer model weights. Sequential aggregation takes inspiration from online learning and game theory in weighting forecasts with rules that have theoretical performance guarantees (Gonzalez et al., 2021; Mallet et al., 2009; Thorey et al., 2017). Forecast weights can also be determined using Markov chain Monte Carlo (Dumont Le Brazidec et al., 2021).…”
Section: Weighting Distinct Forecasts (mentioning)
confidence: 99%
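The sequential aggregation referenced in this excerpt updates the combination weights online as new verifications arrive. Below is a minimal sketch of an exponentially weighted average (EWA) update under an assumed squared-error loss; the variable names and the synthetic usage loop are illustrative and are not taken from any of the cited works.

```python
import numpy as np

def ewa_update(weights, forecasts, observation, eta=0.1):
    """One EWA step: down-weight models exponentially in their incurred loss.

    weights     : (n_models,) current convex weights
    forecasts   : (n_models,) forecasts valid at this verification time
    observation : scalar verifying observation
    eta         : learning rate (assumed fixed here)
    """
    losses = (forecasts - observation) ** 2        # per-model squared-error loss
    new_w = weights * np.exp(-eta * losses)        # exponential re-weighting
    return new_w / new_w.sum()                     # renormalise to a convex combination

# Usage sketch with synthetic data: the weights track model performance over
# time instead of staying constant.
rng = np.random.default_rng(0)
weights = np.full(3, 1.0 / 3.0)
for t in range(100):
    forecasts = rng.normal([0.0, 0.5, 1.0], 1.0)   # hypothetical model outputs
    observation = rng.normal(0.0, 1.0)             # hypothetical verifying observation
    combined = weights @ forecasts                 # aggregated forecast issued before verification
    weights = ewa_update(weights, forecasts, observation)
```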
“…The BOA always leads to a convex combination of the forecasters, as does the EWA. Further, it is well known that the EWA in combination with the gradient trick can achieve optimal convergence rates if the considered updating loss is exp-concave; see [29], [30]. Unfortunately, the required absolute deviation (AD) is not exp-concave.…”
Section: A Formal Description of the Algorithm (mentioning)
confidence: 99%
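The "gradient trick" mentioned in this excerpt replaces each forecaster's own loss in the EWA update with the partial derivative of the aggregated loss with respect to that forecaster's weight; this linearisation is what yields the optimal rates for exp-concave losses. A minimal sketch under an assumed squared-error loss (the names are illustrative, and this is not the BOA discussed in the excerpt):

```python
import numpy as np

def ewa_gradient_trick_update(weights, forecasts, observation, eta=0.1):
    """EWA with the gradient trick (exponentiated-gradient step) for squared error.

    Each model is charged the gradient of the aggregated loss with respect to
    its weight rather than its own loss.
    """
    combined = weights @ forecasts                      # current aggregated forecast
    # d/dw_k of (combined - observation)^2 = 2 * (combined - observation) * forecasts[k]
    pseudo_losses = 2.0 * (combined - observation) * forecasts
    new_w = weights * np.exp(-eta * pseudo_losses)
    return new_w / new_w.sum()
```

Because the update only rescales and renormalises non-negative weights, the aggregated forecast remains a convex combination of the individual forecasters, consistent with the excerpt's remark about both BOA and EWA.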