2022
DOI: 10.48550/arxiv.2203.01744
Preprint

Accelerated SGD for Non-Strongly-Convex Least Squares

Abstract: We consider stochastic approximation for the least squares regression problem in the non-strongly convex setting. We present the first practical algorithm that achieves the optimal prediction error rates in terms of dependence on the noise of the problem, as O(d/t), while accelerating the forgetting of the initial conditions to O(d/t^2). Our new algorithm is based on a simple modification of the accelerated gradient descent. We provide convergence results for both the averaged and the last iterate of the algorithm…
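
The abstract describes the method only at a high level. As a rough illustration of the kind of iteration involved, here is a minimal sketch of accelerated stochastic gradient descent on least squares, assuming single-sample stochastic gradients, a fixed step size, and the classical Nesterov momentum schedule (t-1)/(t+2). The function name, step-size choice, and momentum schedule are illustrative assumptions, not the authors' exact modification of accelerated gradient descent.

```python
import numpy as np

def accelerated_sgd_least_squares(X, y, n_steps, gamma, rng=None):
    """Sketch of a Nesterov-style accelerated SGD loop for least squares.

    Assumptions (not from the paper): one data point is sampled per step,
    the step size `gamma` is fixed, and momentum follows the classical
    (t-1)/(t+2) schedule. This only illustrates the general template of
    accelerated stochastic approximation, not the paper's algorithm.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    theta = np.zeros(d)       # current iterate
    theta_prev = np.zeros(d)  # previous iterate, used for momentum
    avg = np.zeros(d)         # running average of iterates

    for t in range(1, n_steps + 1):
        beta = (t - 1) / (t + 2)                  # momentum coefficient
        nu = theta + beta * (theta - theta_prev)  # extrapolated point
        i = rng.integers(n)                       # sample one observation
        grad = (X[i] @ nu - y[i]) * X[i]          # stochastic gradient at nu
        theta_prev = theta
        theta = nu - gamma * grad                 # gradient step
        avg += (theta - avg) / t                  # online average of iterates

    return theta, avg  # last iterate and averaged iterate
```

Both the last iterate and the running average are returned because the paper gives convergence results for each; e.g., `theta_last, theta_avg = accelerated_sgd_least_squares(X, y, n_steps=10_000, gamma=0.01)`.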

Cited by 0 publications
References 19 publications