2019
DOI: 10.1134/s1064562419020042

Accelerated Primal-Dual Gradient Descent with Linesearch for Convex, Nonconvex, and Nonsmooth Optimization Problems

Cited by 28 publications (13 citation statements)
References 6 publications
“…For the star network, we can compare the complexity of Mirror-Prox with the complexity of the IBP, which runs in $O(n^2/\varepsilon^2)$ time per node [19]. Distributed Mirror-Prox has a better dependence on $\varepsilon$, namely $1/\varepsilon$, as does the accelerated IBP of [15], with $O(n^2\sqrt{n}/\varepsilon)$ complexity per node. However, like any regularization-based method, the (accelerated) IBP has a strong limitation regarding the regularization parameter: numerical instability when the regularization parameter is small.…”
Section: Application of Distributed Mirror-Prox
confidence: 99%
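A minimal sketch (my illustration, not from the cited papers) of the instability mentioned above, assuming entropic regularization as used by Sinkhorn-type/IBP methods: the Gibbs kernel exp(-C/gamma) underflows to zero in double precision once the regularization parameter gamma is small relative to the cost entries, after which the matrix-scaling iterations divide by zero.

```python
import numpy as np

# Hypothetical cost matrix with entries in [0, 1]; gamma is the
# entropic regularization parameter of Sinkhorn/IBP-type methods.
rng = np.random.default_rng(0)
C = rng.random((50, 50))

for gamma in (1e-1, 1e-2, 1e-3):
    K = np.exp(-C / gamma)  # Gibbs kernel used by the scaling iterations
    print(f"gamma={gamma:.0e}: min(K)={K.min():.3e}, "
          f"zero entries={np.count_nonzero(K == 0)}")
# Entries with C/gamma > ~745 underflow to exactly 0.0 in float64, so for
# gamma=1e-3 a large fraction of K vanishes and subsequent normalizations
# (divisions by K @ v) produce inf/nan.
```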
“…In this section, similarly to the concept of a $(\delta, L, \mu, m, V)$-model in optimization, we consider an inexact model for VIs with a stronger version of the monotonicity condition (34).…”
Section: Inexact Model for Strongly Monotone VI
confidence: 99%
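For background (my addition, not a quotation from the cited work, whose condition (34) is not reproduced here): the standard strong monotonicity condition that such "stronger" assumptions typically generalize reads, for an operator $F$ on a set $Q$ and a constant $\mu > 0$,

```latex
% Standard mu-strong monotonicity of an operator F over a set Q:
\forall\, x, y \in Q:\quad
\langle F(x) - F(y),\; x - y \rangle \;\ge\; \mu \,\| x - y \|^2 .
```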
“…We believe that our model is flexible enough to be extended to problems with a primal-dual structure [49,51,54], e.g., problems with linear constraints [2,12,34,58]; to random block-coordinate descent [25]; to tensor methods [30,55]; to the distributed optimization setting [17,18,64,68]; and to adaptive stochastic optimization [36,61].…”
Section: Introduction
confidence: 99%
“…Then we analyze the convergence rate of our primal-dual version of Algorithm 1. Note that the primal-dual analysis of the existing accelerated methods [66,2,7,12,23,24,25,26,33,50] does not apply, since the dual problem is a stochastic optimization problem and we use additional randomization. Algorithm 6.1 of [18], applied to the dual problem (22) with the stochastic inexact oracle $\nabla \Phi(\lambda, \xi, \tilde{\xi})$, is listed as Algorithm C3.…”
Section: Proof of Theorem
confidence: 99%