2021
DOI: 10.1007/978-3-030-91059-4_18

Near-Optimal Decentralized Algorithms for Saddle Point Problems over Time-Varying Networks

Cited by 6 publications (5 citation statements)
References 11 publications
“…The reduction of Algorithm 2 of [24] to the deterministic case achieves optimal complexity bounds for the deterministic setup. The inexact oracle technique was also used in [7,6] to build algorithms that are optimal up to a log(1/ε) factor. On the other hand, adding regularization (Lemma 12) to the results of [24] avoids squared logarithmic factors and gives an algorithm with O(χ (L_ℓ D²/ε) log(1/ε)) communication and O((L_ℓ D²/ε) log(1/ε)) oracle complexities.…”
Section: Gap
confidence: 99%
“…In [17], the authors also obtained lower complexity bounds for solving VIs satisfying Assumptions 1, 2 and 3 with stochastic methods. It turns out that the conclusions of Theorem 4 in the monotone case and the estimate (16) are optimal and match the lower bounds up to numerical constants.…”
Section: General Case
confidence: 87%
“…Algorithm 1 and Algorithm 2 are compared with the other methods in Table 1: Extra Gradient [36,31], Local SGDA [20], FedAvg-S [27], SCAFFOLD-S [27], SCAFFOLD-Catalyst-S [27], ESTVGM [10], SMMDS [15], MASHA [14], Optimistic MASHA [11], Accelerated Extra Gradient [37]. The algorithms are implemented in Python 3.7.…”
Section: Methods
confidence: 99%
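The Extra Gradient baseline mentioned above is the classical extrapolation scheme for saddle point problems min_x max_y f(x, y). As a rough illustration only (not the cited authors' implementation; function names, the step size, and the bilinear test problem are all assumptions), a minimal sketch in Python:

```python
def extragradient(grad_x, grad_y, x, y, step=0.1, iters=2000):
    """Extra Gradient for min_x max_y f(x, y).

    Each iteration takes an extrapolation (look-ahead) step, then
    updates the current point using gradients at the extrapolated point.
    """
    for _ in range(iters):
        xh = x - step * grad_x(x, y)   # extrapolation: descent in x
        yh = y + step * grad_y(x, y)   # extrapolation: ascent in y
        x = x - step * grad_x(xh, yh)  # update with look-ahead gradients
        y = y + step * grad_y(xh, yh)
    return x, y

# Bilinear toy problem f(x, y) = x * y, whose saddle point is (0, 0);
# plain simultaneous gradient descent-ascent diverges here, while the
# extrapolation step makes the iteration contract toward the saddle.
x_star, y_star = extragradient(lambda x, y: y, lambda x, y: x, 1.0, 1.0)
```

The bilinear example is the standard case showing why the look-ahead step matters: without it, the iterates spiral outward.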
“…For an optimal (up to a log factor) method in the general case, we refer to (Tian et al, 2022). For SPPs, optimal (up to a log factor) algorithms were proposed by Beznosikov et al (2021) along with lower bounds on communication complexity: δ/ϵ for convex-concave SPPs, and δ/µ for strongly convex-strongly concave SPPs, where δ is the similarity parameter. These lower bounds enjoy more favorable dependencies on the optimization parameters whenever δ ≪ L. Kovalev et al (2022) improved the results of Tian et al (2022) and Beznosikov et al (2021) by proposing optimal methods for convex minimization and convex-concave SPPs with optimal computational complexity, i.e.…”
Section: Related Work
confidence: 99%