2021
DOI: 10.48550/arxiv.2106.16101
Preprint

AdaGDA: Faster Adaptive Gradient Descent Ascent Methods for Minimax Optimization

Abstract: In this paper, we propose a class of faster adaptive gradient descent ascent methods for solving nonconvex-strongly-concave minimax problems, using the unified adaptive matrices of SUPER-ADAM. Specifically, we propose a fast adaptive gradient descent ascent (AdaGDA) method based on the basic momentum technique, which reaches a low sample complexity of O(κ⁴ε⁻⁴) for finding an ε-stationary point without large batches, improving the existing result of adaptive minimax optimization methods by a f…
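The abstract names two algorithmic ingredients: a momentum (moving-average) gradient estimator on both players, and adaptive step sizes on the descent variable. The following is a minimal, hypothetical Python sketch of that pattern, not the authors' exact AdaGDA update; the AdaGrad-style diagonal accumulator stands in for the paper's "unified adaptive matrices", and the oracle names (grad_x, grad_y), step sizes, and momentum constant are illustrative assumptions.

```python
import numpy as np

def sgda_momentum(grad_x, grad_y, x0, y0, steps=1000,
                  lr_x=1e-3, lr_y=1e-2, beta=0.9, eps=1e-8):
    """Sketch of momentum-based stochastic gradient descent ascent on
    min_x max_y f(x, y). grad_x/grad_y are stochastic gradient oracles."""
    x, y = x0.copy(), y0.copy()
    mx, my = np.zeros_like(x), np.zeros_like(y)  # momentum buffers
    vx = np.zeros_like(x)                        # AdaGrad-style accumulator (assumed stand-in)
    for _ in range(steps):
        gx, gy = grad_x(x, y), grad_y(x, y)
        # moving-average (momentum) gradient estimates
        mx = beta * mx + (1 - beta) * gx
        my = beta * my + (1 - beta) * gy
        vx += gx ** 2                            # diagonal adaptive matrix for x
        x -= lr_x * mx / (np.sqrt(vx) + eps)     # adaptive descent step on x
        y += lr_y * my                           # ascent step on y
    return x, y
```

The single time-scale loop above (one descent and one ascent step per iteration, without inner maximization) mirrors the batch-free, single-loop structure the abstract emphasizes, though the concrete update rules here are only a sketch.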

Cited by 2 publications (1 citation statement) | References 19 publications

“…[Guo and Yang, 2021, Ji et al., 2020, Hong et al., 2020, Chen et al., 2021a]. Also, Luo et al. [2020], Huang and Huang [2021] and Tran-Dinh et al. [2020] explore variance-reduced algorithms in this setting under the averaged smoothness assumption. Concurrently, Fiez et al. [2021] prove perturbed GDA converges to ε-local minimax equilibria with complexities of Õ(ε⁻⁴) and Õ(ε⁻²) in stochastic and deterministic problems, respectively, under additional second-order conditions.…”
Section: NC-SC Minimax Optimization
confidence: 99%