2021
DOI: 10.48550/arxiv.2103.16154
Preprint

Convergence on a symmetric accelerated stochastic ADMM with larger stepsizes

Abstract: In this paper, we develop a symmetric accelerated stochastic Alternating Direction Method of Multipliers (SAS-ADMM) for solving separable convex optimization problems with linear constraints. The objective function is the sum of a possibly nonsmooth convex function and an average of many smooth convex functions. The proposed algorithm combines the ideas of ADMM with the techniques of accelerated stochastic gradient methods, using variance reduction to solve the smooth subproblem. One main feature of SA…
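The abstract describes a two-block structure: a smooth finite-sum block handled with a variance-reduced stochastic gradient, and a possibly nonsmooth block handled through its proximal operator, coupled by a linear constraint and, in the symmetric variant, two dual updates per iteration. The sketch below illustrates that pattern on a toy lasso-type problem; the SVRG-style estimator, the linearized x-update, the dual relaxation factors r and s, and all parameter values are assumptions made for illustration, and the acceleration (momentum) step, exact update order, and stepsize conditions of the paper's SAS-ADMM are omitted.

```python
# Illustrative sketch only (assumed details, not the paper's exact SAS-ADMM):
# a symmetric stochastic ADMM with an SVRG-style variance-reduced gradient for
# the smooth finite-sum block, applied to the toy problem
#     min_{x,y} (1/n) * sum_i 0.5*(a_i^T x - b_i)^2 + mu*||y||_1   s.t. x - y = 0
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 20
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

mu, rho, eta = 0.1, 1.0, 1e-2   # l1 weight, penalty parameter, primal stepsize (assumed)
r, s = 0.9, 0.9                 # symmetric dual stepsizes (placeholder values)

def full_grad(x):
    """Full gradient of the smooth finite-sum term at x."""
    return A.T @ (A @ x - b) / n

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1 (closed form for the nonsmooth block)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

x, y, lam = np.zeros(d), np.zeros(d), np.zeros(d)
for epoch in range(30):
    x_snap, g_snap = x.copy(), full_grad(x)           # SVRG snapshot point
    for _ in range(n):
        i = rng.integers(n)
        g_i = A[i] * (A[i] @ x - b[i])                # sampled gradient at current x
        g_i_snap = A[i] * (A[i] @ x_snap - b[i])      # sampled gradient at snapshot
        v = g_i - g_i_snap + g_snap                   # variance-reduced gradient estimate
        # linearized x-update of the augmented Lagrangian (constraint x - y = 0)
        x = x - eta * (v - lam + rho * (x - y))
        lam = lam - r * rho * (x - y)                 # first (intermediate) dual update
        y = soft_threshold(x - lam / rho, mu / rho)   # proximal update of the y-block
        lam = lam - s * rho * (x - y)                 # second dual update

print("constraint residual ||x - y|| =", np.linalg.norm(x - y))
```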

Cited by 1 publication (2 citation statements)
References: 25 publications
“…We remark that the idea of Algorithm 6 can be applied to stochastic ADMM or its multi-block extensions in the recent works in [10,11] for solving problems arising in machine learning because the basic model therein is a two-block separable convex optimization problem.…”
Section: Version of Alternating Direction Methods of Multipliers (ADMM)
Citation type: mentioning (confidence: 99%)
“…We observe that since T := I − J_{λA} is Lipschitz continuous, there is some flexibility in this choice. Therefore, (11) becomes…”
Section: Motivations from Dynamical Systems
Citation type: mentioning (confidence: 99%)