2022
DOI: 10.48550/arxiv.2202.03397
Preprint

Bilevel Optimization with a Lower-level Contraction: Optimal Sample Complexity without Warm-Start

Abstract: We analyze a general class of bilevel problems, in which the upper-level problem consists in the minimization of a smooth objective function and the lower-level problem is to find the fixed point of a smooth contraction map. This type of problem includes instances of meta-learning, hyperparameter optimization and data poisoning adversarial attacks. Several recent works have proposed algorithms which warm-start the lower-level problem, i.e. they use the previous lower-level approximate solution as a starting poi…

Cited by 1 publication (3 citation statements)
References 16 publications
“…There are also other works that incorporate momentum to accelerate the convergence rate; see e.g., (Chen et al., 2022; Khanduri et al., 2021; Guo and Yang, 2021). After our initial conference submission, we have also noticed some concurrent works that are relevant to this work; e.g., (Dagreou et al., 2022; Grazzi et al., 2022; Li et al., 2022; Hu et al., 2022). Specifically, (Dagreou et al., 2022) proposed an SBO method with the variance-reduction technique and achieved the optimal rate.…”
Section: Appendix A, Additional Related Work
confidence: 78%
“…Specifically, (Dagreou et al., 2022) proposed an SBO method with the variance-reduction technique and achieved the optimal rate. And (Grazzi et al., 2022) proposed an SBO method that achieves the optimal rate without warm-start. The algorithms in (Dagreou et al., 2022; Grazzi et al., 2022) are not instances of the plain-vanilla SA update discussed in this work, and thus their analyses are not applicable to our problem.…”
Section: Appendix A, Additional Related Work
confidence: 99%