2023
DOI: 10.1109/tpami.2023.3234160
Gradient Descent Ascent for Minimax Problems on Riemannian Manifolds

Cited by 13 publications (10 citation statements). References 19 publications.
“…Moreover, lower complexity bounds have also been studied for centralized minimax algorithms (Zhang, Hong, and Zhang 2019; Zhang et al. 2021a). Additionally, more methods employing variance reduction have been considered to improve the performance of stochastic minimax algorithms, e.g., see (Xu et al. 2020; Huang, Wu, and Huang 2021; Luo et al. 2020; Zhang, Aybat, and Gurbuzbalaban 2022). In this paper, to control the noise accumulation, we propose DGDA-VR, a decentralized method employing the SPIDER variance reduction technique (Fang et al. 2018), a variant of SARAH (Nguyen et al. 2017a).…”
Section: Related Work (mentioning, confidence: 99%)
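
The SPIDER estimator mentioned in this excerpt can be sketched in a few lines. Below is a minimal, illustrative Python sketch of a SPIDER-style variance-reduced gradient descent ascent loop; the names (grad_f, sample_batch), the batch sizes, and the step sizes are assumptions for illustration, not the notation or exact algorithm of the cited papers.

# Minimal sketch of gradient descent ascent with a SPIDER-style
# variance-reduced estimator (Fang et al. 2018), a SARAH-type recursion.
# grad_f(x, y, batch) is assumed to return stochastic gradients (gx, gy);
# sample_batch(n) is assumed to draw a minibatch of size n.
def spider_gda(x, y, grad_f, sample_batch, n_iters=1000, q=32,
               big_batch=1024, small_batch=8, lr_x=0.01, lr_y=0.01):
    for t in range(n_iters):
        if t % q == 0:
            # Anchor step: a large batch gives a low-variance snapshot.
            batch = sample_batch(big_batch)
            vx, vy = grad_f(x, y, batch)
        else:
            # Recursive step: correct the previous estimate with the
            # gradient difference on a fresh small batch.
            batch = sample_batch(small_batch)
            gx, gy = grad_f(x, y, batch)
            gx_prev, gy_prev = grad_f(x_prev, y_prev, batch)
            vx = vx + gx - gx_prev
            vy = vy + gy - gy_prev
        x_prev, y_prev = x, y
        x = x - lr_x * vx   # descent on the min variable
        y = y + lr_y * vy   # ascent on the max variable
    return x, y

The recursion keeps the estimator's variance proportional to the distance traveled since the last anchor, which is what lets such methods improve on plain stochastic GDA without a large batch at every step.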
“…Subsequently, accelerated gradient descent ascent methods (Luo et al. 2020; Xu et al. 2020) emerged, leveraging variance reduction techniques to tackle stochastic minimax problems. Additionally, research efforts have explored non-smooth nonconvex-strongly-concave minimax optimization (Huang, Gao, and Huang 2020; Chen et al. 2020). Moreover, Huang, Gao, and Huang (2020) proposed the Riemannian stochastic gradient descent ascent method and several variants for the Riemannian minimax optimization problem.…”
Section: Related Work (mentioning, confidence: 99%)
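
To make the Riemannian gradient descent ascent idea concrete, here is a hedged Python sketch in which the min variable lives on the unit sphere (a deliberately simple example manifold) and the max variable is Euclidean. The tangent-space projection, the retraction, and the step sizes are illustrative assumptions, not the construction of Huang, Gao, and Huang (2020).

import numpy as np

def project_tangent_sphere(x, g):
    # Riemannian gradient on the sphere: project the Euclidean
    # gradient onto the tangent space at x.
    return g - np.dot(g, x) * x

def retract_sphere(x, v):
    # Retraction: step in the tangent direction, then renormalize
    # so the iterate stays on the manifold.
    z = x + v
    return z / np.linalg.norm(z)

def riemannian_sgda(x, y, stoch_grad, n_iters=1000, lr_x=0.01, lr_y=0.01):
    # stoch_grad(x, y) is assumed to return stochastic Euclidean
    # gradients (gx, gy) of the objective.
    for _ in range(n_iters):
        gx, gy = stoch_grad(x, y)
        rx = project_tangent_sphere(x, gx)  # Riemannian gradient at x
        x = retract_sphere(x, -lr_x * rx)   # descent step on the sphere
        y = y + lr_y * gy                   # Euclidean ascent step
    return x, y

The only changes relative to Euclidean SGDA are the projection of the gradient to the tangent space and the retraction in place of a plain addition, which is why manifold-constrained minimax methods can reuse much of the Euclidean analysis.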
“…Approaches such as Stochastic Gradient Descent Ascent (SGDA) have been proposed (Lin, Jin, and Jordan 2020), accompanied by innovations like variance-reduced SGDA (Luo et al. 2020; Xu et al. 2020) that aim to speed up convergence. Moreover, Riemannian manifold-based optimization has been explored (Huang, Gao, and Huang 2020) across different minimax scenarios, showcasing the breadth of available methodologies. However, all of these methods are designed only for the non-compositional problem.…”
Section: Introduction (mentioning, confidence: 99%)
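
As a concrete illustration of plain SGDA, the following self-contained Python sketch runs noisy gradient descent ascent on a toy convex-concave quadratic; the objective f(x, y) = 0.5*x^2 + x*y - 0.5*y^2, the noise level, and the step size are illustrative assumptions, not taken from any of the cited papers.

import numpy as np

# SGDA on a toy saddle problem with saddle point at (0, 0).
# Gaussian noise stands in for stochastic gradients.
rng = np.random.default_rng(0)
x, y = 3.0, -2.0
lr = 0.05
for t in range(2000):
    gx = x + y + 0.1 * rng.standard_normal()  # noisy df/dx
    gy = x - y + 0.1 * rng.standard_normal()  # noisy df/dy
    x -= lr * gx  # descent on x
    y += lr * gy  # ascent on y
print(f"approximate saddle point: x={x:.3f}, y={y:.3f}")  # near (0, 0)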
“…More recently, Huang et al. [2023] and Junchi et al. [2022] presented efficient adaptive stochastic GDA methods for nonconvex-strongly-concave (NC-SC) minimax optimization. Meanwhile, Lu et al. [2020], Chen et al. [2021], and Huang et al. [2021b] proposed effective two-timescale proximal GDA methods to solve NC-SC minimax problems with nonsmooth regularization. In general, two-timescale GDA methods for minimax optimization are easier to implement than multi-step GDA methods and perform better in practice.…”
Section: Algorithm (mentioning, confidence: 99%)
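
A two-timescale proximal GDA step of the kind described above can be sketched as follows. The L1 regularizer, the soft-thresholding proximal operator, and the particular step-size ratio are illustrative assumptions rather than the exact methods of the cited papers.

import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (soft thresholding).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def two_timescale_prox_gda(x, y, grad_f, n_iters=1000,
                           lr_x=0.001, lr_y=0.05, lam=0.1):
    # Two timescales: the ascent step size lr_y is much larger than the
    # descent step size lr_x, so y tracks an approximate best response
    # to the slowly moving x.
    # grad_f(x, y) is assumed to return gradients (gx, gy) of the
    # smooth part f; the nonsmooth term lam * ||x||_1 is handled by prox.
    for _ in range(n_iters):
        gx, gy = grad_f(x, y)
        x = soft_threshold(x - lr_x * gx, lr_x * lam)  # proximal descent
        y = y + lr_y * gy                              # ascent step
    return x, y

The appeal of the two-timescale scheme noted in the excerpt is visible here: each iteration is a single descent step and a single ascent step, with no inner maximization loop as in multi-step GDA.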