2020
DOI: 10.48550/arxiv.2002.03979
Preprint

Online Covariance Matrix Estimation in Stochastic Gradient Descent

Abstract: The stochastic gradient descent (SGD) algorithm is widely used for parameter estimation, especially in online settings. While this recursive algorithm is popular for its computational and memory efficiency, the problem of quantifying the variability and randomness of its solutions has rarely been studied. This paper aims at conducting statistical inference on SGD-based estimates in the online setting. In particular, we propose a fully online estimator for the covariance matrix of averaged SGD (ASGD) iterates. Based on the classic…
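To make the object of the abstract concrete, here is a minimal illustrative sketch: averaged (Polyak-Ruppert) SGD on a least-squares stream, with a simple non-overlapping batch-means estimate of the asymptotic covariance of the averaged iterates. This is not the paper's fully online estimator — the paper's construction is recursive and more careful — just the quantity being estimated; all names and constants below are illustrative.

```python
import numpy as np

# Illustrative sketch only: ASGD on streaming least squares, plus a naive
# batch-means covariance estimate. Not the paper's recursive estimator.
rng = np.random.default_rng(0)
d, n = 3, 20000
theta_star = np.array([1.0, -2.0, 0.5])

theta = np.zeros(d)
theta_bar = np.zeros(d)
iterates = np.zeros((n, d))

for t in range(1, n + 1):
    x = rng.normal(size=d)                # one streaming observation
    y = x @ theta_star + rng.normal()
    grad = (x @ theta - y) * x            # stochastic gradient of squared loss
    eta = 0.5 * t ** -0.55                # step size eta_t = c * t^(-alpha)
    theta -= eta * grad
    theta_bar += (theta - theta_bar) / t  # running Polyak-Ruppert average
    iterates[t - 1] = theta

# Batch-means: split the iterates into M batches; the scaled spread of the
# batch means estimates the asymptotic covariance of the averaged iterates.
M = 100
batch_means = iterates.reshape(M, n // M, d).mean(axis=1)
Sigma_hat = (n // M) * np.cov(batch_means.T)

print(theta_bar)        # close to theta_star
print(Sigma_hat.shape)  # (3, 3)
```

The estimated covariance is what a practitioner would plug into confidence intervals for the ASGD estimate; the point of the paper is to compute such an estimate recursively, without storing the full iterate history as this sketch does.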

Cited by 2 publications (11 citation statements)
References 36 publications
“…We emphasize that this is a non-trivial problem even in the first-order setting, as the iterates x t form an inhomogeneous Markov chain. In this section, we leverage the recent work by [ZCW20], who proposed an online estimator of the asymptotic covariance matrix in the stochastic first-order setting, and extend it to the stochastic zeroth-order setting and propose an estimator of the covariance matrix Ṽ appearing in Proposition 2.1.…”
Section: Online Estimation of Asymptotic Covariance Matrix
confidence: 99%
“…As suggested in [ZCW20], this estimator can be calculated recursively via Algorithm 1. The main difference from [ZCW20] is the use of the stochastic zeroth-order gradient, due to which, several assumptions made in [ZCW20] are not satisfied in the stochastic zeroth-order setting we consider.…”
Section: Online Estimation of Asymptotic Covariance Matrix
confidence: 99%
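The statement above concerns replacing the stochastic gradient with a stochastic zeroth-order gradient. A hedged sketch of the standard two-point (Gaussian-smoothing) zeroth-order gradient estimator that such extensions typically plug into SGD is shown below; the function names, smoothing parameter, and step sizes are illustrative assumptions, not taken from either paper.

```python
import numpy as np

def zeroth_order_grad(f, x, mu=1e-4, rng=None):
    """Two-point Gaussian-smoothing estimate of grad f at x.

    Uses only function evaluations: (f(x + mu*u) - f(x)) / mu * u,
    with u a standard Gaussian direction. Illustrative sketch only.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.normal(size=x.shape)
    return (f(x + mu * u) - f(x)) / mu * u

# Plug the estimator into plain SGD on a toy smooth objective.
rng = np.random.default_rng(1)
f = lambda x: 0.5 * np.sum((x - 1.0) ** 2)   # minimizer: all-ones vector
x = np.zeros(4)
for t in range(1, 5001):
    x -= (0.5 / t ** 0.6) * zeroth_order_grad(f, x, rng=rng)

print(np.round(x, 2))   # approaches the all-ones minimizer
```

Because this estimator is built from function differences rather than true stochastic gradients, its noise structure differs from the first-order case, which is why (per the quoted statement) several assumptions from [ZCW20] no longer hold and the covariance estimator must be re-derived.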