2022
DOI: 10.1609/aaai.v36i7.20701

Fast and Robust Online Inference with Stochastic Gradient Descent via Random Scaling

Abstract: We develop a new method of online inference for a vector of parameters estimated by the Polyak-Ruppert averaging procedure of stochastic gradient descent (SGD) algorithms. We leverage insights from time series regression in econometrics and construct asymptotically pivotal statistics via random scaling. Our approach is fully operational with online data and is rigorously underpinned by a functional central limit theorem. Our proposed inference method has a couple of key advantages over the existing methods. Fi…
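In rough terms, the recipe in the abstract is: run SGD, keep the running Polyak-Ruppert average of the iterates, and studentize that average with a "random scaling" matrix built from the same averaged path, so that no long-run variance matrix ever has to be estimated. The Python sketch below illustrates the idea for least-squares regression. It is an illustration only, not code from the paper: the loss, the step-size schedule, the sample size, and the exact form of the random-scaling matrix (a KVB-style quantity accumulated online) are assumptions made for this example.

import numpy as np

# Sketch: averaged SGD for least squares with an online random-scaling
# matrix. Illustrative assumptions: Gaussian design, step size
# 0.5 * s^(-0.7), n = 100_000 streaming observations.
rng = np.random.default_rng(0)
d = 3
theta_true = np.array([1.0, -0.5, 2.0])

theta = np.zeros(d)        # current SGD iterate theta_s
theta_bar = np.zeros(d)    # Polyak-Ruppert average bar{theta}_s
# Running sums so V_n = n^{-2} * sum_s s^2 (bar_s - bar_n)(bar_s - bar_n)'
# can be assembled at the end without storing the iterate path:
S_bb = np.zeros((d, d))    # sum_s s^2 * bar_s bar_s'
S_b = np.zeros(d)          # sum_s s^2 * bar_s
S_2 = 0.0                  # sum_s s^2

n = 100_000
for s in range(1, n + 1):
    x = rng.normal(size=d)                    # one streaming observation
    y = x @ theta_true + rng.normal()
    grad = (x @ theta - y) * x                # least-squares gradient
    theta = theta - 0.5 * s ** (-0.7) * grad  # Robbins-Monro SGD step
    theta_bar = theta_bar + (theta - theta_bar) / s  # online average
    S_bb += s ** 2 * np.outer(theta_bar, theta_bar)
    S_b += s ** 2 * theta_bar
    S_2 += s ** 2

# Random-scaling matrix, assembled from the running sums.
V = (S_bb - np.outer(S_b, theta_bar) - np.outer(theta_bar, S_b)
     + S_2 * np.outer(theta_bar, theta_bar)) / n ** 2

# Studentized statistic per coordinate. Because the scale is random
# rather than consistent, this is asymptotically pivotal but NOT
# standard normal; critical values come from the fixed-b literature
# (roughly 6.75 for a two-sided 5% test in the tables used by this
# line of work), not from the normal table.
t_stat = np.sqrt(n) * (theta_bar - theta_true) / np.sqrt(np.diag(V))
print("random-scaling t statistics:", t_stat)

This is what makes the method "fully operational with online data": each update touches one observation and a handful of running sums, so nothing is stored and no second pass over the data is needed.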

Cited by 11 publications (4 citation statements). References 22 publications.
“…Because CSA2SLS is computationally intensive, an interesting future research question would be to develop a more efficient computation algorithm. An approach based on stochastic gradient descent (see, for example, Lee et al. [2022]) can be a possible solution.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)
“…The work in Zhu, Chen, and Wu (2023) extends the batch-mean estimator in Chen et al. (2020) to a fully online version. The work in Lee et al. (2022) proposes a random scaling covariance estimator for robust online inference with SGD. Subsequent work in Li, Liang, and Zhang (2023) investigates online statistical inference using nonlinear SA with Markovian data.…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
“…Recent works have explored using SA and SGD iterates to perform statistical inference, e.g., constructing confidence intervals (CIs) around a point estimate (Li et al. 2018; Chen et al. 2020; Li et al. 2022; Lee et al. 2022; Liu, Chen, and Shang 2023). This approach is computationally cheap and scales well with the size and dimension of the dataset: SA updates are computed iteratively, without storing the dataset or making multiple passes over it.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
“…For instance, heteroscedasticity and autocorrelation consistent (HAC) estimators, for example, Newey and West (1987) and Andrews (1991b), have been followed by the fixed-bandwidth kernel approach to obtain an asymptotically pivotal and mixed-normal test; see, for example, Kiefer, Vogelsang, and Bunzel (2000) and Lazarus et al. (2018) for a recent review. In the machine learning literature, Lee, Liao, Seo, and Shin (2022) also employ the random scaling approach for computationally efficient online inference based on the stochastic gradient descent algorithm.…”
Section: 𝒱 Robust Testing
Citation type: mentioning (confidence: 99%)