Alternating direction methods are a common tool for general mathematical programming and optimization. These methods have become particularly important in the field of variational image processing, which frequently requires the minimization of nondifferentiable objectives. This paper considers accelerated (i.e., fast) variants of two common alternating direction methods: the alternating direction method of multipliers (ADMM) and the alternating minimization algorithm (AMA). The proposed acceleration is of the form first introduced by Nesterov for gradient descent methods. When the objective function is strongly convex, global convergence bounds are provided for both classical and accelerated variants of the methods. Numerical examples are presented to demonstrate the superior performance of the fast methods for a wide variety of problems.
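To make the acceleration concrete, here is a minimal Python sketch (an illustration, not the paper's reference implementation) of an ADMM iteration with Nesterov-type extrapolation of the auxiliary and dual variables, applied to a lasso-type problem whose quadratic term is strongly convex when A has full column rank. The problem data, the penalty parameter rho, the momentum rule t_{k+1} = (1 + sqrt(1 + 4 t_k^2))/2, and the omission of any restart safeguard are all assumptions chosen for brevity.

```python
import numpy as np

def soft_threshold(v, tau):
    # Elementwise proximal operator of tau*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fast_admm_lasso(A, b, lam, rho=1.0, iters=200):
    # Sketch: ADMM for min_x 0.5*||A x - b||^2 + lam*||x||_1,
    # with Nesterov-style extrapolation of z (auxiliary) and u (scaled dual).
    n = A.shape[1]
    x = z = z_old = u = u_old = np.zeros(n)
    t = 1.0
    # Pre-factor the x-update system (A^T A + rho I) x = A^T b + rho (z - u).
    M = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    z_hat, u_hat = z, u
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z_hat - u_hat))
        z_old, u_old = z, u
        z = soft_threshold(x + u_hat, lam / rho)   # prox of (lam/rho)*||.||_1
        u = u_hat + x - z                          # scaled dual update
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        beta = (t - 1.0) / t_next                  # momentum weight (0 on step 1)
        z_hat = z + beta * (z - z_old)             # extrapolate z and u
        u_hat = u + beta * (u - u_old)
        t = t_next
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 20))
x_true = np.zeros(20); x_true[:3] = [2.0, -1.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(60)
print(np.round(fast_admm_lasso(A, b, lam=1.0)[:5], 3))
```

With beta = 0 throughout, the loop reduces to classical ADMM; the extrapolation lines are the only change the acceleration introduces.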
We examine the underlying structure of popular algorithms for variational methods used in image processing. We focus here on operator splittings and Bregman methods, based on a unified approach via fixed point iterations and averaged operators. In particular, the recently proposed alternating split Bregman method can be interpreted from different points of view: as a Bregman method, as an augmented Lagrangian method, and as a Douglas-Rachford splitting algorithm, which is a classical operator splitting method. We also study similarities between this method and the forward-backward splitting method when applied to two frequently used models for image denoising, which employ a Besov-norm and a total variation regularization term, respectively. In the first setting, we show that for a discretization based on Parseval frames the gradient descent reprojection and the alternating split Bregman algorithm are equivalent and turn out to be a frame shrinkage method. For the total variation regularizer, we also present a numerical comparison with multistep methods.

Keywords: Douglas-Rachford splitting · forward-backward splitting · Bregman methods · augmented Lagrangian method · alternating split Bregman algorithm · image denoising
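As a concrete illustration of the Douglas-Rachford structure mentioned above, the following Python sketch (a toy example constructed here, not code from the paper) applies the classical Douglas-Rachford iteration to a separable ℓ₂-ℓ₁ denoising problem whose minimizer has the closed form of soft shrinkage, so the fixed point can be checked directly; the step size t and the data are assumptions.

```python
import numpy as np

def prox_f(v, b, t):
    # prox of t*f with f(u) = 0.5*||u - b||^2: (v + t*b) / (1 + t)
    return (v + t * b) / (1.0 + t)

def prox_g(v, lam, t):
    # prox of t*g with g(u) = lam*||u||_1: soft shrinkage, threshold t*lam
    return np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)

def douglas_rachford(b, lam, t=1.0, iters=100):
    # Classical DR iteration for min_u f(u) + g(u):
    #   y <- y + prox_{tf}(2 prox_{tg}(y) - y) - prox_{tg}(y)
    y = np.zeros_like(b)
    for _ in range(iters):
        u = prox_g(y, lam, t)
        y = y + prox_f(2.0 * u - y, b, t) - u
    return prox_g(y, lam, t)

b = np.array([3.0, -0.2, 1.5, -2.0])
print(douglas_rachford(b, lam=1.0))   # approximately matches the reference below
print(prox_g(b, 1.0, 1.0))            # closed-form minimizer: soft-shrink(b, 1.0)
```

The same two-prox pattern underlies the alternating split Bregman method when it is read as Douglas-Rachford splitting; only the choice of f and g changes.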
As first demonstrated by Chambolle and Lions, the staircasing effect of the Rudin-Osher-Fatemi model can be reduced by using infimal convolutions of functionals containing higher order derivatives. In this paper, we examine a modification of such infimal convolutions in a general discrete setting. For the special case of finite difference matrices, we show the relation of our approach to the continuous total generalized variation approach recently developed by Bredies, Kunisch and Pock. We present splitting methods to compute the minimizers of the ℓ₂²-(modified) infimal convolution functionals; these methods are superior to previously applied second order cone programming methods. Moreover, we illustrate the differences between the ordinary and the modified infimal convolution approaches by numerical examples.
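For readers unfamiliar with the construction, the infimal convolution of two convex functionals, and an ℓ₂² denoising model of the kind discussed, can be written as follows; the specific difference matrices D₁, D₂ and weights α, β are notational assumptions for illustration, not the paper's exact formulation.

```latex
% Infimal convolution of convex functionals F_1 and F_2:
\[
  (F_1 \,\square\, F_2)(u) \;:=\; \inf_{v}\,\bigl\{\, F_1(u - v) + F_2(v) \,\bigr\}.
\]
% A discrete denoising model combining first- and second-order
% difference matrices D_1, D_2 (notation assumed):
\[
  \min_{u}\; \tfrac{1}{2}\,\|u - f\|_2^2
  \;+\; \inf_{v}\,\Bigl\{\, \alpha\,\|D_1(u - v)\|_1 + \beta\,\|D_2 v\|_1 \,\Bigr\}.
\]
```

Intuitively, the inner infimum splits the image into a piecewise constant part penalized by a first-order term and a smooth part penalized by a higher-order term, which is what mitigates staircasing.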