2021
DOI: 10.1137/20m1337600

Efficient Search of First-Order Nash Equilibria in Nonconvex-Concave Smooth Min-Max Problems

Cited by 22 publications (7 citation statements); references 13 publications.
“…If f is further assumed to be smooth, Zhao [2020] developed a variant of the AIPP-S algorithm which only requires an inexact gradient of ρ_ξ at each iteration and attains a total gradient complexity of Õ(ε^{-3}). On the other hand, the stationarity of f(·, ·) has been proposed for quantifying efficiency in nonconvex-concave minimax optimization [Lu et al, 2019, Nouiehed et al, 2019, Kong and Monteiro, 2019, Ostrovskii et al, 2020]. Using this notion of stationarity, Kong and Monteiro [2019] attain a rate of O(ε^{-2.5}) but require the exact gradient of ρ_ξ at each iteration.…”
Section: Gradient Complexity
Mentioning confidence: 99%
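The stationarity notion referenced above asks for a point where the gradient norm ‖∇f(x, y)‖ falls below a tolerance ε. None of the cited algorithms are reproduced here; as a minimal illustrative sketch, the following runs plain two-timescale gradient descent-ascent (GDA) on a toy nonconvex-strongly-concave objective (the test function and step sizes are assumptions chosen purely for illustration) and reports the first-order stationarity measure.

```python
import numpy as np

# Toy objective (illustrative only, not from the cited papers):
#   f(x, y) = x*y - y^2/2 + 0.1*sin(x)
# nonconvex in x (via the sine term), strongly concave in y.
def grad(x, y):
    gx = y + 0.1 * np.cos(x)   # partial derivative in x
    gy = x - y                 # partial derivative in y
    return gx, gy

x, y = 1.0, 0.0
eta_x, eta_y = 0.05, 0.5       # two timescales: slow descent in x, fast ascent in y
for _ in range(2000):
    gx, gy = grad(x, y)
    x, y = x - eta_x * gx, y + eta_y * gy

gx, gy = grad(x, y)
norm = np.hypot(gx, gy)        # first-order stationarity measure ||grad f(x, y)||
print(norm)
```

On this strongly concave inner problem the iteration contracts to a stationary point, so the printed norm is far below any practical tolerance ε.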
“…Using this notion of stationarity, Kong and Monteiro [2019] attain a rate of O(ε^{-2.5}) but require the exact gradient of ρ_ξ at each iteration. Without this assumption, the current state-of-the-art rate is Õ(ε^{-2.5}), achieved by our Algorithm 5 and by the algorithm proposed in the concurrent work [Ostrovskii et al, 2020]. Both algorithms are based on constructing an auxiliary function f_{ε,y} and applying an accelerated solver for minimax proximal steps.…”
Section: Gradient Complexity
Mentioning confidence: 99%
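The auxiliary-function idea mentioned above regularizes the inner maximization so it becomes strongly concave. The construction of the cited f_{ε,y} is not reproduced here; the following generic sketch only illustrates the underlying smoothing effect on a toy case: subtracting (ε/2)y² from f(x, y) = xy over y ∈ [−1, 1] turns the nonsmooth envelope max_y xy = |x| into a smooth surrogate that is uniformly within ε/2 of it.

```python
import numpy as np

def phi(x):
    # exact envelope: max over |y| <= 1 of x*y equals |x| (nonsmooth at 0)
    return abs(x)

def phi_eps(x, eps):
    # regularized envelope: max over |y| <= 1 of x*y - (eps/2)*y^2;
    # the inner maximizer has the closed form y* = clip(x/eps, -1, 1)
    y_star = np.clip(x / eps, -1.0, 1.0)
    return x * y_star - 0.5 * eps * y_star ** 2

eps = 0.1
xs = np.linspace(-2.0, 2.0, 401)
gap = max(phi(x) - phi_eps(x, eps) for x in xs)
print(gap)  # uniform approximation gap, bounded by eps/2
```

The regularized envelope is differentiable everywhere (it is a Huber-like function of x), which is what lets an accelerated first-order solver handle the resulting proximal subproblems.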
“…Similar to their minimization counterparts, non-convex constraints have been widely applicable to min-max optimization as well [Heusel et al, 2017, Daskalakis and Panageas, 2018, Balduzzi et al, 2018, Jin et al, 2020]. Recently there has been significant effort in proving tighter results either under more structured assumptions [Thekumprampil et al, 2019, Nouiehed et al, 2019, Lu et al, 2020, Azizian et al, 2020, Diakonikolas, 2020, Golowich et al, 2020, Lin et al, 2020c,b, Liu et al, 2021, Ostrovskii et al, 2021, Kong and Monteiro, 2021], and/or in obtaining last-iterate convergence guarantees [Daskalakis and Panageas, 2018, 2019, Adolphs et al, 2019, Liang and Stokes, 2019, Gidel et al, 2019, Mazumdar et al, 2020, Mokhtari et al, 2020, Lin et al, 2020c, Hamedani and Aybat, 2021, Abernethy et al, 2021, Cai et al, 2022] for computing min-max solutions in convex-concave settings. Nonetheless, the analysis of iteration complexity in the general non-convex non-concave setting is still in its infancy [Vlatakis-Gkaragkounis et al, 2019].…”
Section: Introduction
Mentioning confidence: 99%
“…Recent years have witnessed a surge of research interest in designing algorithms for solving minimax-structured nonconvex problems, e.g., [12, 13, 15, 18, 20, 21, 23, 34, 36, 40, 44, 46-48]. Different from most of the existing works, we are interested in designing a decentralized algorithm for solving (1.1).…”
Mentioning confidence: 99%