2012
DOI: 10.1137/110836936
On the $O(1/n)$ Convergence Rate of the Douglas–Rachford Alternating Direction Method

Cited by 827 publications (682 citation statements) · References 20 publications
“…It is known that in this case every limit point of the iterates is an optimal solution of the problem. Recent work [21,22,26] has shown that, under some additional assumptions, the objective values generated by the ADMM algorithm and its accelerated version (which performs some additional line-search steps for the dual update) converge at rates of $O(1/r)$ and $O(1/r^2)$, respectively. Moreover, if the objective function $f(x)$ is strongly convex and the constraint matrix $E$ has linearly independent rows, then ADMM is known to converge linearly to the unique minimizer of (1.1) [33].…”
Section: Alternating Direction Methods of Multipliers (ADMM)
Citation type: mentioning
confidence: 99%
“…(13a) and (13b) (Eckstein and Bertsekas 1992). Recent theoretical analyses of the convergence behavior of ADMM can also be found, for example, in He and Yuan (2012). Although a detailed theoretical analysis of the proposed method is beyond the scope of our current study, its basic characteristics can be understood from these general results.…”
Section: Stopping Criterion and Final Estimates Of
Citation type: mentioning
confidence: 91%
“…For γ > 0 and μ ∈ [0, 2], the iterates α_k converge to the minimizer of equation (2.5). For more details on the convergence of the Douglas–Rachford algorithm, see [34,35]. The coefficients identified by equation (2.5), denoted α*, will typically have some bias due to the shrinkage function.…”
Section: (B) Numerical Methods and Algorithm
Citation type: mentioning
confidence: 99%
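To make the Douglas–Rachford iteration referenced above concrete, here is a minimal sketch for an ℓ1-regularized least-squares problem, where the shrinkage function mentioned in the quote appears as a soft-thresholding proximal step. The problem setup, function names, and parameter defaults are illustrative assumptions, not the cited paper's equation (2.5):

```python
import numpy as np

def soft_threshold(v, t):
    # Shrinkage (soft-thresholding): the proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def douglas_rachford_lasso(A, b, lam, gamma=1.0, mu=1.0, iters=500):
    # Relaxed Douglas-Rachford iteration for
    #   min_alpha  0.5 * ||A @ alpha - b||^2 + lam * ||alpha||_1,
    # with step size gamma > 0 and relaxation parameter mu
    # (convergence is standard for mu in (0, 2)).
    n = A.shape[1]
    M = np.eye(n) + gamma * (A.T @ A)  # system matrix for the quadratic prox step
    Atb = A.T @ b
    z = np.zeros(n)
    for _ in range(iters):
        x = np.linalg.solve(M, z + gamma * Atb)       # prox of the least-squares term
        y = soft_threshold(2.0 * x - z, gamma * lam)  # prox of the l1 term (shrinkage)
        z = z + mu * (y - x)                          # relaxed DR update
    return x  # the returned coefficients are biased by shrinkage, as the quote notes

# Typical call (hypothetical data): alpha = douglas_rachford_lasso(A, b, lam=0.1)
```

In a production implementation one would cache a Cholesky factorization of M rather than calling np.linalg.solve in every iteration; the sketch keeps the per-step proximal structure explicit instead.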