2021
DOI: 10.48550/arxiv.2111.03137
Preprint

Big-Step-Little-Step: Efficient Gradient Methods for Objectives with Multiple Scales

Abstract: We provide new gradient-based methods for efficiently solving a broad class of ill-conditioned optimization problems. We consider the problem of minimizing a function f : R d → R which is implicitly decomposable as the sum of m unknown non-interacting smooth, strongly convex functions and provide a method which solves this problem with a number of gradient evaluations that scales (up to logarithmic factors) as the product of the square roots of the condition numbers of the components. This complexity bound (whi…
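The multi-scale setup in the abstract can be made concrete with a small example. The sketch below is not the paper's algorithm; it only constructs a toy separable quadratic whose two hidden components each have a modest condition number while the overall condition number is far worse, and runs plain gradient descent to show why such objectives are hard for standard methods. All constants, step sizes, and iteration counts are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not the paper's method): a toy objective that is an
# implicit sum of two non-interacting quadratics on different scales,
#   f(x) = 0.5 * x^T diag(L1, mu1, L2, mu2) x,
# so each component has condition number 100, but the overall condition
# number L_max / mu_min is 10^6. All numbers here are assumed for illustration.
import numpy as np

L1, mu1 = 1e4, 1e2    # component 1: condition number kappa1 = 100
L2, mu2 = 1e0, 1e-2   # component 2: condition number kappa2 = 100
diag = np.array([L1, mu1, L2, mu2])

def grad(x):
    # Gradient of the separable quadratic 0.5 * sum_i diag[i] * x[i]^2.
    return diag * x

x = np.ones(4)
eta = 1.0 / diag.max()            # plain gradient descent: step limited by L_max
for _ in range(5000):
    x = x - eta * grad(x)

# The slowest coordinate contracts by only (1 - mu_min / L_max) = 1 - 1e-6 per
# step, even though each component alone is well conditioned -- this gap is the
# multi-scale structure the paper's method is designed to exploit.
print("final objective:", 0.5 * np.dot(diag * x, x))
```

Running the sketch shows the coordinates tied to the large-scale component converge almost immediately, while the small-scale coordinates barely move after thousands of iterations, which is the behavior the abstract's complexity bound is contrasted against.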

Cited by 0 publications
References 64 publications