2012
DOI: 10.1109/tip.2012.2188033
An Alternating Direction Algorithm for Total Variation Reconstruction of Distributed Parameters

Abstract: Augmented Lagrangian variational formulations and alternating optimization have been adopted to solve distributed parameter estimation problems. The alternating direction method of multipliers (ADMM) is one such formulation/optimization method. Very recently, the number of applications of the ADMM, or variants of it, to solve inverse problems in image and signal processing has increased at an exponential rate. The reason for this interest is that ADMM decomposes a difficult optimization problem into a seq…


Cited by 24 publications (12 citation statements)
References 31 publications
“…Our method was implemented in MATLAB R2015b, and all experiments were performed on a server with the following hardware specifications: 64 Intel(R) Xeon(R) 2.30GHz CPUs with 8 cores, and 128 GB RAM. For sGC, we used the publicly available B-K MATLAB tool 6 , which implements the max-flow algorithm. In the next sections, we present the results obtained for high resolution 2D multichannel images, 3D MRI data and a squared curvature regularization example.…”
Section: Methods
confidence: 99%
“…LSA-TR significantly outperforms popular non-submodular optimization techniques such as TRWS and QPBO; see the comparative energy plots in [10].…”
6. http://vision.csd.uwo.ca/code/
[Figure: visual example of 2D image segmentation by the sGC and DOPE approaches (left) and the evolution of the segmentation energy in both formulations with respect to the number of iterations (right).]
confidence: 99%
“…which attempts to preserve the sparsity of the vector x = T f in the domain of the representation basis, and the smoothness of f in the spatial domain via the operator L, which enforces piecewise constant solutions and is associated with the TV regularizer (see [26] for more details). Note that ||·||₁ is the ℓ₁ norm and that λ₁, λ₂ are two regularization parameters.…”
Section: Fusing Compressed HS and MS Images Using a Regularized I…
confidence: 99%
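The composite penalty this excerpt describes — an ℓ₁ term promoting sparsity in a representation basis plus a TV-type term through an operator L — can be sketched as a single function. The names `Psi`, `L`, and the weights below are hypothetical stand-ins, since the excerpt does not give concrete operators:

```python
import numpy as np

def composite_penalty(f, Psi, L, lam1, lam2):
    """Evaluate lam1*||Psi^T f||_1 + lam2*||L f||_1.

    lam1 weights sparsity of the coefficients x = Psi^T f in the
    representation basis; lam2 weights the TV-like term ||L f||_1,
    which favors piecewise-constant f.
    """
    x = Psi.T @ f
    return lam1 * np.abs(x).sum() + lam2 * np.abs(L @ f).sum()
```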
“…v_0 and v (in (24)) can be solved similarly. The Lagrangians L_0(u_0) and L_0(v_0) being non-differentiable, we update these parameters using soft-thresholding operations, which result from the computation of appropriate proximal operators (see [26], [27] for details). Algorithm 3 summarizes how each variable u and v can be updated from the minimization of their respective Lagrangians, where…”
Section: Algorithm
confidence: 99%
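The soft-thresholding operation mentioned in this excerpt is the proximal operator of the ℓ₁ norm. A minimal sketch of the standard formula (not taken from [26], [27]):

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau*||.||_1.

    Shrinks each entry of x toward zero by tau and clips the
    entries whose magnitude is below tau to exactly zero.
    """
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)
```

This closed form is why the non-differentiable ℓ₁ subproblems in such ADMM-style schemes stay cheap: each update is a single elementwise operation.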
“…In recent years, a large number of TV methods have been extensively studied for additive noise removal [3,4], most of which are convex variation models. The convex models can be optimized using simple and reliable numerical methods, such as the gradient descent [5], primal-dual formulation [6], alternating direction method of multipliers [7], and Bregmanized operator splitting [8].…”
Section: Introduction
confidence: 99%