2017
DOI: 10.3982/ecta13372
Forecasting With Model Uncertainty: Representations and Risk Reduction

Abstract: This supplement introduces some alternative procedures to the ones considered in the main text, and provides extended numerical comparisons of local asymptotic risk among the various methods. It also conducts a small Monte Carlo study of finite-sample risk, and provides a comparison of shrinkage factors for a number of the procedures.

Cited by 29 publications (16 citation statements)
References 43 publications
“…For bagging, λ = 1, while various Bayesian predictors, including Bayesian model averaging and empirical Bayes, can also be formulated in this manner by setting λ appropriately. Interestingly, Hirano and Wright (2017) show that forecasting models constructed using out‐of‐sample or split‐sample schemes perform well only when combined with other methods, such as bagging. Broadly speaking, their results offer a glimpse into the benefits of using state‐of‐the‐art (asymptotic) statistical analysis to examine new methods that combine conventional out‐of‐sample approaches to model selection and estimation with algorithmic approaches such as bagging.…”
Section: Dimension Reduction and Penalized Regression
confidence: 99%
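The combination described in the excerpt — bagging applied to a pre-test (split-sample style) forecasting rule — can be sketched in a toy location model. This is an illustrative Monte Carlo, not code from the paper; the function names `pretest` and `bagged_pretest` and the specific critical value are assumptions for the sketch. Bagging averages the all-or-nothing pre-test rule over bootstrap resamples, turning a discontinuous decision into a smooth shrinkage of the estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def pretest(sample, crit=1.96):
    """Pre-test rule: use the sample mean only if its t-statistic
    exceeds the critical value, otherwise forecast zero."""
    m = sample.mean()
    t = m / (sample.std(ddof=1) / np.sqrt(len(sample)))
    return m if abs(t) > crit else 0.0

def bagged_pretest(sample, n_boot=200):
    """Bagging: average the pre-test estimator over bootstrap
    resamples, which smooths the hard threshold."""
    n = len(sample)
    draws = rng.choice(sample, size=(n_boot, n), replace=True)
    return float(np.mean([pretest(d) for d in draws]))

# Borderline signal so the pre-test decision is genuinely uncertain
sample = rng.normal(loc=0.2, scale=1.0, size=100)
hard = pretest(sample)          # either 0.0 or the full sample mean
smooth = bagged_pretest(sample) # typically an intermediate, shrunk value
```

The hard rule jumps between zero and the unrestricted estimate; the bagged version moves continuously between them, which is the smoothing effect the excerpt attributes to combining split-sample schemes with bagging.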
“…A closely related strand considers shrinkage (penalized regression) methods, including ridge regression, the least absolute shrinkage and selection operator (lasso), the elastic net, and the non‐negative garrote. These and other shrinkage‐related methods are discussed in Bai and Ng, Schumacher, Stock and Watson, Kim and Swanson, and Hirano and Wright (2017), for example. Broadly speaking, the number of such methods available to empiricists is now immense.…”
Section: Introduction
confidence: 99%
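In the orthonormal-design case, the methods listed above reduce to simple scalar shrinkage rules applied to each least-squares coefficient, which is one way to compare their shrinkage factors. These are the standard textbook formulas, not code from any of the cited papers:

```python
import numpy as np

def ridge_shrink(b, lam):
    # Ridge: proportional shrinkage of every coefficient toward zero
    return b / (1.0 + lam)

def lasso_shrink(b, lam):
    # Lasso: soft-thresholding; coefficients inside [-lam, lam] become zero
    return np.sign(b) * np.maximum(np.abs(b) - lam, 0.0)

def pretest_shrink(b, crit):
    # Pre-test / hard-thresholding: all-or-nothing selection
    return np.where(np.abs(b) > crit, b, 0.0)

b = np.linspace(-3.0, 3.0, 7)
ridge = ridge_shrink(b, 1.0)    # halves every coefficient
lasso = lasso_shrink(b, 1.0)    # moves each toward zero by 1, zeroing small ones
hard = pretest_shrink(b, 1.0)   # keeps large coefficients untouched, kills the rest
```

Ridge never sets a coefficient exactly to zero, lasso does so continuously, and the pre-test rule does so with a jump; this difference in shrinkage factors is what the local asymptotic risk comparisons in the paper's supplement quantify.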
“…Source of Model Uncertainty. The uncertainty of the DDM arises from a series of factors, including incorrect model settings, imperfect input information, and the inherent randomness of events and behaviors [21]. The PM and EPT are combined to establish the DDM, which fully embodies the comprehensive socioeconomic benefits of pipeline projects.…”
Section: Uncertainty of Dominance Degree Model
confidence: 99%
“…Second, machine learning methods have been employed to generate new statistical procedures specifically tailored for economic applications in recent work by Belloni, Chen, Chernozhukov, and Hansen (2012), Belloni, Chernozhukov, and Hansen (2014), Chernozhukov, Hansen, and Spindler (2015), Fan, Liao, and Yao (2015), Hirano and Wright (2017) and Caner and Kock (2018), to name a few. Boosting is one of the most successful machine learning methods.…”
Section: Introduction
confidence: 99%
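The boosting method mentioned in this excerpt can be illustrated with componentwise L2 boosting, the variant most common in econometric forecasting: at each step, fit the current residual on the single best predictor and take a small step toward that fit. This is a generic sketch of the technique, not the specific procedure of any cited paper; the function name and parameter values are illustrative.

```python
import numpy as np

def l2_boost(X, y, steps=300, nu=0.1):
    """Componentwise L2 boosting: repeatedly regress the residual on
    the one column that reduces squared error most, moving that
    coefficient by a small step of size nu."""
    n, p = X.shape
    beta = np.zeros(p)
    resid = y.astype(float).copy()
    for _ in range(steps):
        # univariate least-squares coefficient of the residual on each column
        coefs = X.T @ resid / (X ** 2).sum(axis=0)
        # pick the column whose univariate fit leaves the smallest SSE
        sse = ((resid[:, None] - X * coefs) ** 2).sum(axis=0)
        j = int(np.argmin(sse))
        beta[j] += nu * coefs[j]
        resid -= nu * coefs[j] * X[:, j]
    return beta

# Toy data: only the first of five predictors matters
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=200)
beta_hat = l2_boost(X, y)
```

Early stopping (the `steps` parameter) acts as the regularizer here, which is why boosting sits naturally alongside the shrinkage methods discussed in the other excerpts.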