Persistence in forecasting performance and conditional combination strategies
Aiolfi and Timmermann (2006)
DOI: 10.1016/j.jeconom.2005.07.015

Cited by 293 publications (243 citation statements)
References 17 publications
“…As in Stock and Watson (2004), we use δ = 1.0 (no discounting of past forecasts) and δ = 0.9 (more weight on most recent forecasts) resulting in Stock and Watson's discounted mean-square forecast error (DMSFE) methods DMSFE(1) and DMSFE(0.9). Finally, we use cluster combinations to control for forecast persistence (see Aiolfi and Timmermann, 2006). Utilizing a hold-out period, the individual ARDL models are ranked by their mean square forecast error (MSFE) and clusters are created by consecutively adding the next lowest MSFE ARDL model to the cluster.…”
Section: Combination Predictor Models (mentioning)
Confidence: 99%
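The DMSFE weighting and MSFE-based ranking described in that statement can be sketched briefly. The following is a minimal Python illustration, not code from the cited papers; the variable names (sq_errors, delta) and the shape conventions are assumptions.

import numpy as np

def dmsfe_weights(sq_errors, delta=0.9):
    """Discounted mean squared forecast error (DMSFE) combination weights.

    sq_errors: array of shape (T, M) holding past squared forecast errors of
    M individual models over T hold-out periods (most recent period last).
    delta: discount factor; delta = 1.0 means no discounting of past errors,
    delta < 1.0 puts more weight on the most recent forecast errors.
    """
    T, M = sq_errors.shape
    # Discount factors delta^(T-1), ..., delta, 1 (most recent period last)
    discounts = delta ** np.arange(T - 1, -1, -1)
    dmsfe = discounts @ sq_errors              # discounted MSFE per model
    inv = 1.0 / dmsfe
    return inv / inv.sum()                     # weights proportional to 1/DMSFE

def rank_models_by_msfe(sq_errors):
    """Rank models by hold-out MSFE (lowest first); clusters can then be built
    by consecutively adding the next-lowest-MSFE model, as the quote describes."""
    return np.argsort(sq_errors.mean(axis=0))

# Illustrative use: combined forecast = dmsfe_weights(sq_errors, 0.9) @ forecasts,
# where forecasts holds the M models' forecasts for the next period.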
“…Garratt et al. (2009) also propose to combine the nowcast from a large number of models in a two-step procedure. Our approach is close to Aiolfi and Timmermann (2006) in the sense that we combine models in more than one stage. They find that forecasting performance can be improved by first sorting models into clusters based on their past performance, second by pooling forecasts within each cluster, and third by estimating optimal weights on these clusters (followed by shrinkage towards equal weights).…”
Section: Forecast Framework (mentioning)
Confidence: 96%
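A minimal sketch of that multi-stage logic, under simplifying assumptions that are not taken from the paper: equal-weight pooling within clusters, ordinary least-squares weights across cluster forecasts, and a fixed shrinkage parameter lam; all names are illustrative.

import numpy as np

def cluster_pool_combine(forecasts, sq_errors, y, n_clusters=3, lam=0.5):
    """Three-stage combination in the spirit of Aiolfi and Timmermann (2006):
    (1) sort models into clusters by past MSFE, (2) pool forecasts within each
    cluster with equal weights, (3) estimate least-squares weights on the
    cluster forecasts and shrink them towards equal weights.

    forecasts: (T, M) individual model forecasts over the estimation window
    sq_errors: (T0, M) past squared errors used to rank the models
    y: (T,) realized values matching the rows of forecasts
    lam: shrinkage towards equal weights (0 = no shrinkage, 1 = equal weights)
    """
    order = np.argsort(sq_errors.mean(axis=0))               # best models first
    clusters = np.array_split(order, n_clusters)              # stage 1: clusters
    pooled = np.column_stack([forecasts[:, c].mean(axis=1)    # stage 2: pool
                              for c in clusters])
    w_ls, *_ = np.linalg.lstsq(pooled, y, rcond=None)         # stage 3: LS weights
    w_eq = np.full(n_clusters, 1.0 / n_clusters)
    w = (1 - lam) * w_ls + lam * w_eq                          # shrink to equal
    return clusters, w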
“…Cluster combinations or combining, as developed by Aiolfi and Timmermann (2006), is a conditional combining approach which incorporates information about the forecast persistence and historical performance of individual models (Rapach & Strauss, 2010). In this study, we specifically employ the Previous Best conditional combination strategy.…”
Section: Cluster Combinations (mentioning)
Confidence: 99%
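The Previous Best strategy mentioned in that statement can be illustrated with a short sketch: at each period, use the forecast of the model that performed best in the most recent evaluation window. The window length and names below are assumptions for illustration only.

import numpy as np

def previous_best_forecast(forecasts, sq_errors, window=1):
    """Conditional 'Previous Best' combination: select the forecast of the
    model with the lowest average squared error over the most recent
    `window` hold-out periods.

    forecasts: (M,) current-period forecasts of the M models
    sq_errors: (T, M) past squared forecast errors (most recent period last)
    """
    recent = sq_errors[-window:].mean(axis=0)   # recent performance per model
    best = int(np.argmin(recent))               # index of previously best model
    return forecasts[best], best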