2020
DOI: 10.48550/arxiv.2005.14057
Preprint

Machine Learning Time Series Regressions with an Application to Nowcasting

Abstract: This paper introduces structured machine learning regressions for high-dimensional time series data potentially sampled at different frequencies. The sparse-group LASSO estimator can take advantage of such time series data structures and outperforms the unstructured LASSO. We establish oracle inequalities for the sparse-group LASSO estimator within a framework that allows for the mixing processes and recognizes that the financial and the macroeconomic data may have heavier than exponential tails. An empirical …
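To make the abstract's estimator concrete, below is a minimal sketch of a sparse-group LASSO fit by proximal gradient descent (ISTA). It is not the authors' implementation: the solver choice, the toy design with twelve high-frequency lags per predictor grouped by predictor, and the tuning values `lam` and `alpha` are illustrative assumptions.

```python
# Sparse-group LASSO sketch: (1/2n)||y - Xb||^2 + alpha*lam*||b||_1
#   + (1 - alpha)*lam * sum_g sqrt(p_g) * ||b_g||_2,
# solved by proximal gradient descent (ISTA). Illustrative only.
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding (prox of t * ||.||_1)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_group_lasso(X, y, groups, lam=0.1, alpha=0.5, n_iter=2000):
    """ISTA for the sparse-group LASSO objective above."""
    n, p = X.shape
    beta = np.zeros(p)
    # Step size = 1 / Lipschitz constant of the smooth part's gradient.
    step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y) / n
        v = beta - step * grad
        # Prox of the sparse-group penalty: elementwise shrinkage
        # followed by blockwise (group) shrinkage.
        v = soft_threshold(v, step * alpha * lam)
        for g in np.unique(groups):
            idx = groups == g
            norm_g = np.linalg.norm(v[idx])
            thr = step * (1 - alpha) * lam * np.sqrt(idx.sum())
            v[idx] = 0.0 if norm_g <= thr else (1 - thr / norm_g) * v[idx]
        beta = v
    return beta

# Toy mixed-frequency-style design: each group collects the high-frequency
# lags of one predictor, so the group penalty selects whole predictors
# while the l1 part sparsifies lags within a selected group.
rng = np.random.default_rng(0)
n, n_groups, lags = 200, 10, 12
X = rng.standard_normal((n, n_groups * lags))
groups = np.repeat(np.arange(n_groups), lags)
beta_true = np.zeros(n_groups * lags)
beta_true[:4] = [1.0, 0.5, 0.25, 0.1]   # only the first predictor matters
y = X @ beta_true + 0.5 * rng.standard_normal(n)
beta_hat = sparse_group_lasso(X, y, groups, lam=0.1, alpha=0.5)
print("nonzero groups:", np.unique(groups[np.abs(beta_hat) > 1e-6]))
```

Setting `alpha=1` recovers the unstructured LASSO, while `alpha=0` gives the group LASSO; the intermediate values exploit the lag-group structure the abstract refers to.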

Cited by 3 publications (4 citation statements) | References 31 publications
“…Interestingly, our study complements theirs in terms of documenting that the (news-based) RF method is better than both the LASSO and the PCA across nearly all outcome variables and forecasting horizons. In contrast, Babii et al (2020) propose a new sparse-group LASSO estimator and show that it performs favorably compared to other alternatives, especially when combined with text as data, when nowcasting US GDP growth.…”
Section: Introduction (mentioning)
confidence: 99%
“…By using ML techniques to form predictions, our analysis also relates to recent research by Medeiros et al (2019) and Babii et al (2020). Whereas Medeiros et al (2019) use the FRED-MD dataset to compare ML models for inflation forecasting, we focus on the (textual) news versus hard economic data dimension when forecasting National Account Statistics.…”
Section: Introduction (mentioning)
confidence: 99%
“…Related Literature. The use of ridge regression is in fact common in the forecasting literature: Inoue and Kilian (2008) use ridge regularization for forecasting U.S. consumer price inflation and argue that it compares favorably with bagging techniques; De Mol et al (2008) obtain the ridge estimators in the Bayesian context for the purposes of forecasting; Ghosh et al (2019) again study the Bayesian ridge, this time however in the high-dimensional context; Coulombe (2020), Babii et al (2020) and Medeiros et al (2021) compare LASSO, ridge and other machine learning techniques for forecasting with large economic datasets. The ridge penalty is considered within a more general mixed ℓ1-ℓ2 penalization setting in Smeekes and Wijler (2018), who discuss the performance and robustness of penalized estimates for forecasting purposes.…”
Section: Introduction (mentioning)
confidence: 99%
“…An exception is Babii et al (2020), who recently used the sparse-group lasso to accommodate the dynamic nature of high-dimensional, mixed-frequency data. Nonetheless, they address univariate MIDAS regressions, leaving convex regularization of MF-VARs unexplored.…”
Section: Introduction (mentioning)
confidence: 99%