2023
DOI: 10.48550/arxiv.2302.01463
Preprint

Convergence of Gradient Descent with Linearly Correlated Noise and Applications to Differentially Private Learning

Abstract: We study stochastic optimization with linearly correlated noise. Our study is motivated by recent methods for optimization with differential privacy (DP), such as DP-FTRL, which inject noise via matrix factorization mechanisms. We propose an optimization problem that distils key facets of these DP methods and that involves perturbing gradients by linearly correlated noise. We derive improved convergence rates for gradient descent in this framework for convex and nonconvex loss functions. Our theoretical analys…
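
To make the setup concrete, the following is a minimal NumPy sketch (not the authors' implementation) of gradient descent in which each step's gradient is perturbed by linearly correlated Gaussian noise: the noise added at step t is a fixed linear combination of i.i.d. Gaussian vectors, w_t = sum_k B[t, k] z_k. The mixing matrix B, the step count, and the quadratic example loss are illustrative placeholders standing in for the noise factor that a matrix-factorization DP mechanism such as DP-FTRL would produce.

    import numpy as np

    def correlated_noise_gd(grad, x0, steps, lr, B, sigma, rng):
        """Gradient descent with linearly correlated noise (illustrative sketch).

        grad  : callable returning the gradient at a point
        B     : (steps, steps) matrix mixing the i.i.d. noise vectors
        sigma : standard deviation of the underlying i.i.d. Gaussian noise
        """
        d = x0.shape[0]
        z = rng.normal(scale=sigma, size=(steps, d))  # i.i.d. Gaussian seed vectors
        w = B @ z  # linearly correlated noise: w[t] = sum_k B[t, k] * z[k]
        x = x0.copy()
        for t in range(steps):
            x = x - lr * (grad(x) + w[t])  # perturbed gradient step
        return x

    # Example: quadratic loss f(x) = 0.5 * ||x||^2, so grad f(x) = x.
    rng = np.random.default_rng(0)
    T, d = 100, 5
    B = np.tril(np.ones((T, T))) / T  # hypothetical lower-triangular mixing matrix
    x_final = correlated_noise_gd(lambda x: x, np.ones(d), T, lr=0.1,
                                  B=B, sigma=0.05, rng=rng)
    print(x_final)

Setting B to the identity recovers ordinary gradient descent with independent noise; a nontrivial B is what makes the perturbations linearly correlated across steps.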

Cited by 0 publications
References 22 publications