2019
DOI: 10.2139/ssrn.3503191
Estimation and HAC-based Inference for Machine Learning Time Series Regressions

Cited by 3 publications (3 citation statements). References 86 publications (106 reference statements).
“…The ridge regression would typically require m/T → 0, cf. Carrasco et al (2007b). The LASSO would require approximate sparsity, somewhat stronger weak dependence conditions, and m^{1/κ}/T^{1−1/κ} → 0, where κ measures tails and weak dependence, cf. Babii et al (2019).…”
Section: Monte Carlo Experiments
confidence: 99%
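To illustrate the contrast drawn in this statement, the following is a minimal simulated sketch (not the cited papers' estimators): a ridge fit shrinks all m coefficients toward zero but keeps them nonzero, while a LASSO fit with an approximately sparse truth sets most of them exactly to zero. The data-generating process, penalty levels, and AR(1) error choice here are arbitrary illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

rng = np.random.default_rng(0)
T, m = 200, 50                     # sample size and number of regressors
X = rng.standard_normal((T, m))
beta = np.zeros(m)
beta[:3] = [1.5, -1.0, 0.5]        # approximately sparse truth: 3 active coefficients

# AR(1) errors to mimic weak dependence in the regression noise
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.5 * eps[t - 1] + rng.standard_normal()
y = X @ beta + eps

ridge = Ridge(alpha=1.0).fit(X, y)   # shrinks all m coefficients, none exactly zero
lasso = Lasso(alpha=0.1).fit(X, y)   # zeroes out most of the m coefficients

print("LASSO nonzero:", int((np.abs(lasso.coef_) > 1e-8).sum()))
print("ridge nonzero:", int((np.abs(ridge.coef_) > 1e-8).sum()))
```

The sketch only conveys the qualitative point behind the quoted rate conditions: ridge keeps all m directions and so needs m small relative to T, whereas the LASSO exploits sparsity and tolerates a larger m, at the price of the stronger conditions the statement lists.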
“…3 In particular, Belloni et al (2012) propose to use the LASSO to address the problem of many instruments and the nonparametric series estimation of the optimal instrument. Our mixed-frequency IV regression is qualitatively different from the above models and does not impose approximate sparsity on the high-dimensional slope coefficients; see Babii, Ghysels, and Striaukas (2019) for a comprehensive treatment of approximately sparse mixed-frequency time series regressions. The problem of the optimal instrument is more challenging in our nonparametric setting and is left for future research; see Florens and Sokullu (2018) for some steps in this direction.…”
Section: Introduction
confidence: 99%
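As background on the mixed-frequency regressions this statement refers to, here is a hypothetical sketch (not the cited paper's IV estimator) of the standard MIDAS device: many high-frequency lags are collapsed into a single low-frequency regressor through a parsimonious weight function, here an exponential Almon polynomial. The function names and parameter values are illustrative assumptions.

```python
import numpy as np

def exp_almon_weights(n_lags, theta1=0.1, theta2=-0.05):
    """Exponential Almon lag polynomial; weights are positive and sum to one."""
    j = np.arange(1, n_lags + 1)
    w = np.exp(theta1 * j + theta2 * j**2)
    return w / w.sum()

def midas_aggregate(x_high, n_lags, freq_ratio, **kw):
    """Collapse a high-frequency series into one low-frequency regressor.

    x_high     : high-frequency observations, oldest first
    n_lags     : number of high-frequency lags entering each low-frequency period
    freq_ratio : high-frequency obs per low-frequency period (e.g. 3 months/quarter)
    """
    w = exp_almon_weights(n_lags, **kw)
    ends = np.arange(n_lags, len(x_high) + 1, freq_ratio)
    # weighted sum of the n_lags most recent high-frequency values at each period end
    return np.array([w @ x_high[e - n_lags:e][::-1] for e in ends])

x = np.random.default_rng(1).standard_normal(120)   # e.g. 120 monthly observations
z = midas_aggregate(x, n_lags=12, freq_ratio=3)     # one quarterly regressor
print(z.shape)
```

Because the weight function has only two parameters regardless of the number of lags, this aggregation avoids the high-dimensional slope vector that would otherwise motivate a sparsity assumption, which is the contrast the quoted statement draws.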
“…The methodologies we employ have been applied in the context of macroeconomic forecasting using a large number of predictors. Specifically, studies using shrinkage methods to examine the predictability of key macroeconomic indicators include Bai and Ng (2008), De Mol et al (2008), Stock and Watson (2012), Carrasco and Rossi (2016), Kotchoni et al (2019), and Babii et al (2019). The advantages of machine learning in the context of asset pricing and return predictability have been explored, among others, by Rapach et al (2013), Neely et al (2014), Lima and Meng (2017), Kelly et al (2019), Rapach et al (2019), and Kozak (2019).…”
Section: Introduction
confidence: 99%