2014
DOI: 10.1090/s0025-5718-2014-02829-9

An augmented Lagrangian based parallel splitting method for separable convex minimization with applications to image processing

Abstract: This paper considers the convex minimization problem with linear constraints and a separable objective function which is the sum of many individual functions without coupled variables. An algorithm is developed by splitting the augmented Lagrangian function in a parallel way. The new algorithm differs substantially from existing splitting methods in alternating style, which require solving the decomposed subproblems sequentially, while it retains the main superiority of existing splitting methods in that the re…
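For orientation, here is a minimal sketch of the problem class and of the augmented Lagrangian that the parallel splitting acts on; the notation (theta_i, x_i, A_i, b, beta, lambda) is generic and not taken from the paper's own numbering:

    \min_{x_1,\dots,x_m} \ \sum_{i=1}^{m} \theta_i(x_i)
    \quad \text{s.t.} \quad \sum_{i=1}^{m} A_i x_i = b, \qquad x_i \in \mathcal{X}_i, \ i = 1,\dots,m,

    \mathcal{L}_\beta(x_1,\dots,x_m,\lambda)
      = \sum_{i=1}^{m} \theta_i(x_i)
      - \lambda^\top \Big( \sum_{i=1}^{m} A_i x_i - b \Big)
      + \frac{\beta}{2} \Big\| \sum_{i=1}^{m} A_i x_i - b \Big\|^2 .

Splitting this augmented Lagrangian "in a parallel way" means that each block x_i is minimized with the other blocks frozen at their previous values, so the m subproblems within one iteration are independent of each other.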

Cited by 70 publications (69 citation statements). References: 53 publications.
“…The second one is the Jacobian-type splitting method, which decomposes the quadratic term in a Jacobian way such that the resulting subproblems can be solved simultaneously. Specifically, for given initial points $x_1^k, \dots, x_m^k, \lambda^k$, they first generate the predictor via (1.3b) and then update the next iterate via an additional correction step to guarantee their global convergence; see, e.g., [25,28,30,31] for more details. It is clear that both types of splitting methods can make full use of the separable property.…”
Section: Augmented-Lagrangian-Based Splitting Methods (citation type: mentioning; confidence: 99%)
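In generic notation, the Jacobian-type prediction described in the quote above takes roughly the following form (an illustrative sketch; the cited papers' scheme (1.3b) and their specific correction rules are not reproduced here):

    \tilde{x}_i^{\,k} = \arg\min_{x_i \in \mathcal{X}_i}
      \Big\{ \theta_i(x_i) - (\lambda^k)^\top A_i x_i
      + \frac{\beta}{2} \big\| A_i x_i + \textstyle\sum_{j \neq i} A_j x_j^k - b \big\|^2 \Big\},
      \qquad i = 1,\dots,m,

    \tilde{\lambda}^{\,k} = \lambda^k - \beta \Big( \sum_{i=1}^{m} A_i \tilde{x}_i^{\,k} - b \Big).

Because every subproblem depends only on the previous iterate $(x_1^k,\dots,x_m^k,\lambda^k)$, the m minimizations can be carried out simultaneously; the predictor $(\tilde{x}^k,\tilde{\lambda}^k)$ is then combined with the current iterate in a correction step to obtain the next iterate.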
“…From the data reported in [35,49], we observe that fewer correction variables may yield more promising numerical results. Moreover, the numerical results in [28] show that Jacobian-type splitting methods outperform Gauss-Seidel-type methods in computing time when solving large-scale problems. Thus, we hope to develop an algorithm that not only possesses easier subproblems in many cases, but also inherits the advantages of Gauss-Seidel-type and Jacobian-type splitting methods, such as fewer correction (prediction) variables and simultaneous implementation of the subproblems.…”
Section: Augmented-Lagrangian-Based Splitting Methods (citation type: mentioning; confidence: 99%)
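To make the "simultaneous implementation" concrete, here is a small self-contained Python sketch on a toy two-block problem with closed-form subproblems. The problem data, the penalty parameter beta, and the fixed relaxation factor alpha are illustrative choices, and the simple under-relaxation correction below is a simplification, not the correction rule of any of the cited papers.

    import numpy as np

    # Toy separable problem:
    #   minimize 0.5*||x1 - a1||^2 + 0.5*||x2 - a2||^2  subject to  x1 + x2 = b,
    # solved with a parallel (Jacobian) splitting of the augmented Lagrangian
    # followed by a simple under-relaxation correction step.

    a1 = np.array([1.0, 2.0])
    a2 = np.array([-1.0, 0.5])
    b = np.array([2.0, 1.0])

    beta = 1.0    # penalty parameter of the augmented Lagrangian
    alpha = 0.5   # relaxation factor of the correction step (illustrative choice)

    x1, x2, lam = np.zeros(2), np.zeros(2), np.zeros(2)

    for k in range(200):
        # Prediction: each subproblem uses only the previous iterate (x1, x2, lam),
        # so the two block updates are independent and could run in parallel.
        x1_t = (a1 + lam + beta * (b - x2)) / (1.0 + beta)
        x2_t = (a2 + lam + beta * (b - x1)) / (1.0 + beta)
        lam_t = lam - beta * (x1_t + x2_t - b)

        # Correction: move a fraction alpha of the way from the iterate to the predictor.
        x1 = x1 - alpha * (x1 - x1_t)
        x2 = x2 - alpha * (x2 - x2_t)
        lam = lam - alpha * (lam - lam_t)

    print("x1 =", x1, " x2 =", x2, " feasibility =", np.linalg.norm(x1 + x2 - b))

For this instance the iterates approach x1 = (2, 1.25) and x2 = (0, -0.25), the projection of (a1, a2) onto the constraint set.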
“…The scheme (1.7), however, is not necessarily convergent even when m = 2, as shown in [22]. In the literature, it was suggested to correct the output of (1.7) by some correction steps to ensure convergence; some prediction-correction methods based on the Jacobian decomposition of the ALM (1.7) were thus presented in the literature, see, e.g., [20,22]. Note that these prediction-correction methods usually converge fast for some applications arising in image processing and other areas.…”
Section: The ALM with Full Jacobian Decomposition and LQP Regularization (citation type: mentioning; confidence: 99%)
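As an illustration of what such a correction step typically looks like (a generic form, not the specific rules of [20] or [22]), the new iterate is obtained by moving from the current point toward the Jacobian predictor:

    w^{k+1} = w^k - \alpha_k \big( w^k - \tilde{w}^k \big),
    \qquad w = (x_1,\dots,x_m,\lambda),

where the step size $\alpha_k$ is either a fixed constant or computed from the prediction residual so that global convergence can be established.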