2003
DOI: 10.2139/ssrn.457360

Forecasting U.S. Inflation by Bayesian Model Averaging

Abstract: Recent empirical work has considered the prediction of inflation by combining the information in a large number of time series. One such method that has been found to give consistently good results consists of simple equal-weighted averaging of the forecasts over a large number of different models, each of which is a linear regression model that relates inflation to a single predictor and a lagged dependent variable. In this paper, I consider using Bayesian Model Averaging for pseudo out-of-sample prediction o…
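
The baseline method described in the abstract lends itself to a short illustration. Below is a minimal sketch, in Python, of equal-weighted forecast combination over bivariate models of that kind: each candidate model regresses next-period inflation on a constant, lagged inflation, and one predictor, and the combined forecast is the simple average across models. The function name, the one-step-ahead setup, and the single-lag specification are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def combined_forecast(inflation, predictors):
    """inflation: length-T array; predictors: T x K array of candidate predictors.
    Returns the equal-weighted one-step-ahead forecast of inflation at T+1."""
    T = len(inflation)
    y = inflation[1:]                                   # inflation at t+1, for t = 1..T-1
    forecasts = []
    for k in range(predictors.shape[1]):
        # regressors dated t: constant, lagged inflation, and the k-th predictor
        X = np.column_stack([np.ones(T - 1), inflation[:-1], predictors[:-1, k]])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        x_last = np.array([1.0, inflation[-1], predictors[-1, k]])
        forecasts.append(x_last @ beta)
    return float(np.mean(forecasts))                    # equal weights across the K models
```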

Cited by 37 publications (55 citation statements)
References 25 publications
“…The basic structure of the forecasting models examined is the same as that examined in Artis et al (2002), Bai and Ng (2002, 2006a,b, 2008, 2009), Boivin and Ng (2005) and Stock and Watson (2002, 2005, 2006, 2012). In particular, we consider models of the following generic form:…”
Section: Diffusion Index Models (mentioning, confidence: 99%)
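
The quoted passage is cut off before the model itself. For reference, a minimal sketch of the generic diffusion-index forecasting equation used in the Stock and Watson papers cited above, with the notation assumed here rather than reproduced from the truncated quote:

```latex
% Generic diffusion-index (factor-augmented) forecasting equation in the
% Stock-Watson style: \hat{F}_t are common factors estimated from a large panel
% of predictors; \beta(L) and \gamma(L) are finite-order lag polynomials.
\[
  y_{t+h} = \alpha + \beta(L)'\hat{F}_t + \gamma(L)\,y_t + \varepsilon_{t+h}
\]
```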
“…We refer the reader to Stock and Watson (1999, 2005, 2012) and Bai and Ng (2002, 2008, 2009) for a detailed explanation of this procedure, and to Connor and Korajczyk (1986, 1988, 1993), Forni et al (2005) and Armah and Swanson (2010) for further detailed discussion of factor-augmented autoregression models. Finally, note that Ding and Hwang (1999) also analyze the properties of forecasts constructed using principal components when N and T are large, although they carry out their analysis under the assumption that the error processes {e_{tj}, ε_{t+h}} are cross-sectionally and serially iid. Bagging, which is short for "bootstrap aggregation", was introduced by Breiman (1996).…”
Section: Robust Estimation Techniques (mentioning, confidence: 99%)
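
Since the quote introduces bagging only in passing, here is a minimal sketch of how bootstrap aggregation is commonly applied to a forecasting regression: refit the model on bootstrap resamples and average the resulting forecasts. The pre-testing / variable-selection step usually combined with bagging in this literature is omitted, and all names are illustrative assumptions.

```python
import numpy as np

def bagged_forecast(X, y, x_new, n_boot=200, seed=0):
    """X: T x k regressor matrix, y: length-T target, x_new: length-k regressor
    vector for the forecast period. Returns the forecast averaged over n_boot
    bootstrap refits of the regression."""
    rng = np.random.default_rng(seed)
    T = len(y)
    preds = []
    for _ in range(n_boot):
        # iid resampling with replacement; block resampling is more common
        # for dependent time-series data, but is omitted here for brevity
        idx = rng.integers(0, T, size=T)
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        preds.append(x_new @ beta)
    return float(np.mean(preds))
```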
“…Following Bai and Ng (2002, 2006b, 2008, 2009), the whole panel of data X = (X_1, ..., X_N) can be represented as in (3). We then estimate the factors, F_t, via principal components analysis, independent component analysis, or sparse principal component analysis.…”
Section: Factor Models: Basic Framework (mentioning, confidence: 99%)
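
For concreteness, a minimal sketch of the principal-components step mentioned in the quote, in the spirit of the Bai and Ng estimator: the factor estimates are, up to normalization, the leading eigenvectors of ZZ' computed from the standardized T x N panel. The function name and the standardization choice are assumptions, not details from the cited papers.

```python
import numpy as np

def pc_factors(X, r):
    """X: T x N data panel, r: number of factors. Returns a T x r matrix of factor estimates."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize each series
    # Factors are sqrt(T) times the eigenvectors of Z Z' associated with the
    # r largest eigenvalues (the usual principal-components normalization).
    eigvals, eigvecs = np.linalg.eigh(Z @ Z.T)
    order = np.argsort(eigvals)[::-1][:r]
    return np.sqrt(Z.shape[0]) * eigvecs[:, order]
```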
“…Koop and Potter (2004) and Wright (2008, 2009). For a concise discussion of BMA, see Hoeting et al (1999) and Chipman et al (2001). The basic idea of BMA starts with supposing that interest focuses on Q possible models, denoted by M_1, ..., M_Q, say. In forecasting contexts, BMA involves averaging the target predictions Y_{t+h} from the candidate models, with weights appropriately chosen.…”
Section: Bayesian Model Averaging (BMA) (mentioning, confidence: 99%)
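
To make the "weights appropriately chosen" remark concrete, here is a minimal sketch of BMA-style forecast combination in which each model's forecast is weighted by an approximate posterior model probability built from BIC, a common approximation to the marginal likelihood. The BIC-based weighting and all names are illustrative assumptions, not details taken from the quoted passage.

```python
import numpy as np

def bma_forecast(forecasts, bics, prior=None):
    """forecasts: length-Q array of the candidate models' forecasts of Y_{t+h};
    bics: length-Q array of the models' BIC values;
    prior: optional prior model probabilities (equal by default).
    Returns the posterior-probability-weighted forecast."""
    forecasts, bics = np.asarray(forecasts, float), np.asarray(bics, float)
    Q = len(forecasts)
    prior = np.full(Q, 1.0 / Q) if prior is None else np.asarray(prior, float)
    # exp(-BIC/2) approximates each model's marginal likelihood up to a constant;
    # subtracting the minimum BIC keeps the exponentials numerically stable.
    weights = prior * np.exp(-0.5 * (bics - bics.min()))
    weights /= weights.sum()
    return float(weights @ forecasts)
```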