2012
DOI: 10.21236/ada580738

An Efficient Augmented Lagrangian Method with Applications to Total Variation Minimization

Abstract: Based on the classic augmented Lagrangian multiplier method, we propose, analyze and test an algorithm for solving a class of equality-constrained non-smooth optimization problems (chiefly but not necessarily convex programs) with a particular structure. The algorithm effectively combines an alternating direction technique with a nonmonotone line search to minimize the augmented Lagrangian function at each iteration. We establish convergence for this algorithm, and apply it to solving problems in image reconst…
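
As a rough illustration of the alternating-direction structure described in the abstract, the sketch below minimizes a toy 1-D TV-regularized least-squares model by splitting w = Du and alternating a shrinkage step, a quadratic u-step, and a multiplier update. It is a minimal sketch, not the paper's TVAL3 implementation: the problem form, the parameters mu and beta, and the exact solve of the u-subproblem (used here in place of the paper's nonmonotone line search with Barzilai-Borwein steps) are assumptions made for brevity.

```python
import numpy as np

def soft_shrink(x, tau):
    """Soft-thresholding: closed-form minimizer of the w-subproblem."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def tv_al_sketch(A, b, mu=100.0, beta=10.0, iters=200):
    """Toy augmented Lagrangian / alternating direction scheme for
        min_u  sum_i |(D u)_i| + (mu/2) ||A u - b||^2,
    using the splitting w = D u.  Illustrative only."""
    m, n = A.shape
    # Forward-difference operator D (last row zeroed as a simple boundary choice).
    D = np.eye(n, k=1) - np.eye(n)
    D[-1, :] = 0.0

    u = np.zeros(n)
    w = np.zeros(n)
    lam = np.zeros(n)

    # Small dense u-subproblem matrix; the paper instead takes inexact
    # gradient steps with a nonmonotone line search, avoiding this system.
    H = beta * D.T @ D + mu * A.T @ A
    Atb = mu * A.T @ b

    for _ in range(iters):
        # w-step: component-wise soft shrinkage.
        w = soft_shrink(D @ u + lam / beta, 1.0 / beta)
        # u-step: exact solve of the quadratic subproblem (sketch only).
        rhs = D.T @ (beta * w - lam) + Atb
        u = np.linalg.solve(H, rhs)
        # Multiplier update for the constraint D u = w.
        lam += beta * (D @ u - w)
    return u

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 100, 60
    u_true = np.repeat([0.0, 1.0, -0.5, 2.0], 25)   # piecewise-constant signal
    A = rng.standard_normal((m, n)) / np.sqrt(m)     # random sensing matrix
    b = A @ u_true + 0.01 * rng.standard_normal(m)
    u_rec = tv_al_sketch(A, b)
    print("relative error:", np.linalg.norm(u_rec - u_true) / np.linalg.norm(u_true))
```

In the actual algorithm the u-subproblem is handled inexactly, which is what keeps large imaging problems tractable; the exact solve above is only viable because this toy example is small.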

Cited by 100 publications (109 citation statements) | References 22 publications

Citation statements (ordered by relevance):
“…The ADM framework is well suited to solving the proposed model, in that the model has two separate variables. Moreover, the convergence analysis and conclusions of [25] guarantee that this kind of methodology performs efficiently. Note that (8) becomes conventional constrained TV minimization when p = 1.…”
Section: Generalized Total p-Variation Minimization
confidence: 81%
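
The excerpt above refers to a generalized total p-variation model that reduces to conventional TV when p = 1. The citing paper's exact definition is not reproduced here; the snippet below sketches one common smoothed isotropic form of such a regularizer, and the function name, the smoothing parameter eps, and the edge handling are illustrative assumptions.

```python
import numpy as np

def total_p_variation(u, p=1.0, eps=1e-8):
    """Hypothetical discrete total p-variation of a 2-D image u:
        TpV(u) = sum_i (|grad u|_i^2 + eps)^(p/2),
    where |grad u|_i is the isotropic gradient magnitude at pixel i.
    For p = 1 (and eps -> 0) this reduces to conventional isotropic TV."""
    dx = np.diff(u, axis=1, append=u[:, -1:])  # forward differences, replicated edge
    dy = np.diff(u, axis=0, append=u[-1:, :])
    grad_mag_sq = dx**2 + dy**2
    return np.sum((grad_mag_sq + eps) ** (p / 2.0))

# Sanity check on a piecewise-constant image.
img = np.zeros((8, 8)); img[:, 4:] = 1.0
print(total_p_variation(img, p=1.0))   # ~8: eight unit-height edge pixels, plain TV
print(total_p_variation(img, p=0.5))   # p < 1: nonconvex variant promoting sparser gradients
```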
“…With the aid of GPU acceleration of the forward and backward projections [25], the times required for five and six projections are 1.1 and 1.2 s per iteration, respectively. In the first group of noiseless data simulations, six projections are used to perform the reconstruction.…”
Section: Simulation Experiments With Ideal Data
confidence: 99%
“…Our aim is to further verify the efficiency of the proposed strictly contractive PRSM (1.5) by comparing it numerically with four well-known algorithms in the imaging literature: SALSA [1], TwIST [4], SpaRSA [52], FISTA [2], and YALL1/TVAL3 [36, 37, 38, 54, 57]. …”
Section: Numerical Results
confidence: 99%
“…Due to the non-differentiability and non-linearity of the TV term in problem (2), this problem is computationally challenging to solve despite its simple form. In this paper, we propose to solve TV deconvolution problems [3, 4, 10] by the linearized alternating direction method (LADM), a variant of the classic augmented Lagrangian method for structured optimization.…”
Section: Osher and Fatemi
confidence: 99%
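
For the linearized alternating direction method mentioned in the last excerpt, the sketch below shows the generic structure of one LADM pass for a 1-D TV-regularized deblurring model: the w-subproblem is still solved exactly by shrinkage, but the u-subproblem is replaced by a single linearized (gradient-type) step. The model, the step-size rule for tau, and the full linearization of both smooth terms are assumptions of this sketch and may differ from the scheme used in the citing paper.

```python
import numpy as np

def soft_shrink(x, tau):
    """Soft-thresholding: closed-form minimizer of the w-subproblem."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ladm_tv_deconv(K, f, mu=50.0, beta=5.0, iters=500):
    """Generic linearized ADM sketch for
        min_u  sum_i |(D u)_i| + (mu/2) ||K u - f||^2,
    with the splitting w = D u.  Unlike a plain ADM/augmented Lagrangian
    step, the u-subproblem is not solved exactly: the smooth terms are
    linearized at the current iterate and u takes one step of size 1/tau."""
    n = K.shape[1]
    D = np.eye(n, k=1) - np.eye(n); D[-1, :] = 0.0
    u = np.zeros(n); w = np.zeros(n); lam = np.zeros(n)
    # Step-size bound: tau >= beta*||D||^2 + mu*||K||^2 covers the Lipschitz
    # constant of the linearized smooth part.
    tau = beta * np.linalg.norm(D, 2) ** 2 + mu * np.linalg.norm(K, 2) ** 2
    for _ in range(iters):
        w = soft_shrink(D @ u + lam / beta, 1.0 / beta)              # exact w-step
        grad = D.T @ (lam + beta * (D @ u - w)) + mu * K.T @ (K @ u - f)
        u = u - grad / tau                                           # linearized u-step
        lam += beta * (D @ u - w)                                    # multiplier update
    return u

# Usage idea: K could be an n-by-n Toeplitz blurring matrix and f the
# observed blurry signal; both are left to the caller in this sketch.
```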