This paper studies gradient-based schemes for image denoising and deblurring problems based on the constrained discretized total variation (TV) minimization model. We derive a fast algorithm for the constrained TV-based image deblurring problem by combining an acceleration of the well-known dual approach to the denoising problem with a novel monotone version of the fast iterative shrinkage/thresholding algorithm (FISTA) that we have recently introduced. The resulting gradient-based algorithm combines remarkable simplicity with a proven global rate of convergence that is significantly better than that of currently known gradient-projection-based methods. Our results apply to both the anisotropic and isotropic discretized TV functionals. Initial numerical results demonstrate the viability and efficiency of the proposed algorithms on image deblurring problems with box constraints.
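As a rough illustration of the monotone FISTA scheme mentioned above, the following is a minimal generic Python sketch, assuming a composite objective F = f + g with a smooth term f whose gradient is L-Lipschitz and a regularizer g with an available proximal operator. The callables grad_f, prox_g, and F are hypothetical placeholders, not the paper's TV-specific routines; for instance, with g the l1-norm scaled by a parameter, prox_g would be entrywise soft-thresholding.

```python
import numpy as np

def mfista(x0, grad_f, prox_g, F, L, n_iter=100):
    """Monotone FISTA sketch (hypothetical generic form): like FISTA,
    but the next iterate is the better (lower-objective) of the prox
    step and the previous iterate, which forces the sequence of
    objective values to be nonincreasing."""
    x_prev = x0
    y = x0
    t = 1.0
    for _ in range(n_iter):
        # Proximal gradient step at the extrapolated point y.
        z = prox_g(y - grad_f(y) / L, 1.0 / L)
        # Monotone step: keep the better of z and the last iterate.
        x = z if F(z) <= F(x_prev) else x_prev
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # Extrapolation uses both the prox point z and the accepted iterate x.
        y = x + (t / t_next) * (z - x) + ((t - 1.0) / t_next) * (x - x_prev)
        x_prev, t = x, t_next
    return x_prev
```

Compared with plain FISTA, the only extra cost per iteration is one objective evaluation, which buys guaranteed monotonicity of the objective values.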
In this paper we study smooth convex programming problems in which the decision variables vector is split into several blocks of variables. We analyze the block coordinate gradient projection method, in which each iteration consists of performing a gradient projection step with respect to a certain block taken in a cyclic order. A global sublinear rate of convergence is established for this method, and it is shown that it can be accelerated when the problem is unconstrained. In the unconstrained setting we also prove a sublinear rate of convergence for the so-called alternating minimization method when the number of blocks is two. When the objective function is also assumed to be strongly convex, a linear rate of convergence is established.
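Concretely, one sweep of the block coordinate gradient projection method performs a gradient projection step on each block in cyclic order. Below is a minimal Python sketch under these assumptions; blocks, grad_block, project, and the per-block Lipschitz constants L are hypothetical placeholders for problem-specific data, not an implementation from the paper.

```python
def bcgp(x0, blocks, grad_block, project, L, n_iter=100):
    """Cyclic block coordinate gradient projection sketch.

    blocks     -- list of index arrays, one per block of variables
    grad_block -- grad_block(x, i) returns the partial gradient of the
                  objective with respect to block i at the full point x
    project    -- project(v, i) is the orthogonal projection of v onto
                  the feasible set of block i
    L          -- per-block Lipschitz constants of the partial gradients
    """
    x = x0.copy()
    for _ in range(n_iter):
        # One sweep: a gradient projection step on each block in cyclic order.
        for i, idx in enumerate(blocks):
            g = grad_block(x, i)
            x[idx] = project(x[idx] - g / L[i], i)
    return x
```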
Introduction. One of the first variable decomposition methods for solving general minimization problems is the so-called alternating minimization method [5,14], which is based on successive global minimization with respect to each component vector in a cyclic order. This fundamental method appears in the literature under various names, such as the block-nonlinear Gauss-Seidel method or the block coordinate descent method (see, e.g., [4]). The convergence of the method has been extensively studied in the literature under various assumptions. For example, Auslender studied in [1] the convergence of the method under a strong convexity assumption, but without assuming differentiability. In [4] Bertsekas showed that if the minimum with respect to each block of variables is unique, then any accumulation point of the sequence generated by the method is a stationary point. Grippo and Sciandrone established in [7] convergence results for the sequence generated by the method under different sets of assumptions, such as strict quasiconvexity with respect to each block. Luo and Tseng proved in [9] that under the assumptions of strong convexity with respect to each block, existence of a local error bound on the objective function, and proper separation of isocost surfaces, a linear rate of convergence can be established.

Another closely related method, which will be the main focus of this paper, is the block coordinate gradient projection (BCGP) method, in which at each subiteration the exact minimization with respect to a certain block of variables is replaced by a single step of the gradient projection method (a step along the negative gradient followed by an orthogonal projection onto the feasible set). This method has a clear advantage over alternating minimization when exact minimization with respect to each of the blocks is difficult or costly to carry out.
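For comparison with the BCGP step, here is a minimal Python sketch of the two-block alternating minimization method described above. The per-block solvers argmin_x and argmin_y are hypothetical placeholders for problem-specific exact minimization oracles; in practice their availability is exactly what distinguishes this method from BCGP.

```python
def alternating_minimization(x0, y0, argmin_x, argmin_y, n_iter=100):
    """Two-block alternating minimization sketch: at each iteration,
    minimize the objective exactly over one block while holding the
    other block fixed, in cyclic order.

    argmin_x -- argmin_x(y) returns a global minimizer over x of f(x, y)
    argmin_y -- argmin_y(x) returns a global minimizer over y of f(x, y)
    """
    x, y = x0, y0
    for _ in range(n_iter):
        x = argmin_x(y)  # exact global minimization in x with y fixed
        y = argmin_y(x)  # exact global minimization in y with x fixed
    return x, y
```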