Emilie Chouzenoux, Jean-Christophe Pesquet and Audrey Repetti

Abstract: We consider the minimization of a function G defined on R^N, which is the sum of a (not necessarily convex) differentiable function and a (not necessarily differentiable) convex function. Moreover, we assume that G satisfies the Kurdyka-Łojasiewicz property. Such a problem can be solved with the Forward-Backward algorithm. However, this algorithm may suffer from slow convergence. We propose an acceleration strategy based on the use of variable metrics and on the Majorize-Minimize principle. We give conditions under which the sequence generated by the resulting Variable Metric Forward-Backward algorithm converges to a critical point of G. Numerical results illustrate the performance of the proposed algorithm in an image reconstruction application.
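To make the scheme above concrete, the following is a minimal sketch of a variable metric forward-backward iteration for an ℓ1-regularized least-squares problem, using a diagonal Majorize-Minimize majorant of the Hessian as the metric. The problem instance, step-size choice, and function names are illustrative assumptions and do not reproduce the paper's exact algorithm or convergence conditions.

```python
import numpy as np

def soft_threshold(v, t):
    """Componentwise soft-thresholding: proximity operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def vmfb_l1_least_squares(H, y, lam, n_iter=200):
    """Variable metric forward-backward sketch for
        minimize_x  0.5*||H x - y||^2 + lam*||x||_1.
    The diagonal metric A = diag(|H|^T |H| 1) majorizes H^T H, so each
    iteration minimizes a quadratic majorant of the smooth term, and the
    proximal step in this metric is still a componentwise soft-threshold.
    """
    x = np.zeros(H.shape[1])
    a = np.abs(H).T @ (np.abs(H) @ np.ones(H.shape[0])) + 1e-12  # diagonal MM metric
    for _ in range(n_iter):
        grad = H.T @ (H @ x - y)                    # forward (gradient) step
        x = soft_threshold(x - grad / a, lam / a)   # backward (proximal) step in the metric
    return x

# Toy usage with a random sparse recovery instance
rng = np.random.default_rng(0)
H = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
y = H @ x_true + 0.01 * rng.standard_normal(50)
x_hat = vmfb_l1_least_squares(H, y, lam=0.1)
```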
A number of recent works have emphasized the prominent role played by the Kurdyka-Łojasiewicz inequality in proving the convergence of iterative algorithms for possibly nonsmooth/nonconvex optimization problems. In this work, we consider the minimization of an objective function satisfying this property, which is the sum of a not necessarily convex differentiable function and a function that is not necessarily differentiable or convex. The latter function is expressed as a separable sum of functions of blocks of variables. Such an optimization problem can be addressed with the Forward-Backward algorithm, which can be accelerated thanks to the use of variable metrics derived from the Majorize-Minimize principle. We propose to combine the latter acceleration technique with an alternating minimization strategy which relies upon a flexible update rule. We give conditions under which the sequence generated by the resulting Block Coordinate Variable Metric Forward-Backward algorithm converges to a critical point of the objective function. An application example to a nonconvex phase retrieval problem encountered in signal/image processing shows the efficiency of the proposed optimization method.
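As a rough illustration of the block alternation described above, the sketch below applies a forward-backward step to one block of variables at a time, each with its own diagonal majorant as the block metric, for an ℓ1-regularized least-squares toy problem (the ℓ1 norm is separable across blocks). The cyclic ordering, block partition, and function names are assumptions made for this example; the paper's update rule is more flexible than a plain cyclic sweep.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bc_vmfb_l1_least_squares(H, y, lam, blocks, n_epochs=100):
    """Block coordinate variable metric forward-backward sketch for
        minimize_x  0.5*||H x - y||^2 + lam*||x||_1,
    where each block j of variables is updated by a forward-backward step
    restricted to that block, in the diagonal metric diag(|H_j|^T |H_j| 1)
    which majorizes the block Hessian H_j^T H_j.
    """
    x = np.zeros(H.shape[1])
    residual = -y                                   # maintains H x - y incrementally
    for _ in range(n_epochs):
        for idx in blocks:                          # cyclic sweep over blocks
            Hj = H[:, idx]
            aj = (np.abs(Hj).T @ np.abs(Hj)) @ np.ones(len(idx)) + 1e-12
            grad_j = Hj.T @ residual                # gradient w.r.t. block j only
            xj_new = soft_threshold(x[idx] - grad_j / aj, lam / aj)
            residual += Hj @ (xj_new - x[idx])      # keep the residual consistent
            x[idx] = xj_new
    return x

# Toy usage: 100 variables split into 10 blocks of size 10
blocks = [np.arange(10 * j, 10 * (j + 1)) for j in range(10)]
```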
In the context of next-generation radio telescopes, like the Square Kilometre Array, the efficient processing of large-scale datasets is extremely important. Convex optimisation tasks under the compressive sensing framework have recently emerged and provide both enhanced image reconstruction quality and scalability to increasingly larger data sets. We focus herein mainly on scalability and propose two new convex optimisation algorithmic structures able to solve the convex optimisation tasks arising in radio-interferometric imaging. They rely on proximal splitting and forward-backward iterations and can be seen, by analogy with the CLEAN major-minor cycle, as running sophisticated CLEAN-like iterations in parallel in multiple data, prior, and image spaces. Both methods support any convex regularisation function, in particular the well-studied ℓ1 priors promoting image sparsity in an adequate domain. Tailored for big data, they employ parallel and distributed computations to achieve scalability, in terms of both memory and computational requirements. One of them also exploits randomisation, over data blocks at each iteration, offering further flexibility. We present simulation results showing the feasibility of the proposed methods as well as their advantages compared to state-of-the-art algorithmic solvers. Our MATLAB code is available online on GitHub.
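A highly simplified sketch of the data-splitting idea is given below: the data fidelity term is separated into blocks, each block contributes an independent forward (gradient) step that could be computed in parallel or on distributed nodes, and a single proximal step enforces an ℓ1 prior. This is only a toy forward-backward scheme under these assumptions; it does not reproduce the paper's primal-dual structure, its randomisation over data blocks, or its distributed implementation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def block_split_forward_backward(Phi_blocks, y_blocks, lam, step, n_iter=200):
    """Toy forward-backward iteration with the data fidelity split into blocks:
        minimize_x  sum_b 0.5*||Phi_b x - y_b||^2 + lam*||x||_1.
    Each block's gradient Phi_b^T (Phi_b x - y_b) depends only on that block's
    data, so the forward steps can be evaluated in parallel and then summed;
    the step size should satisfy step <= 1/L, with L the squared spectral norm
    of the stacked measurement operator.
    """
    x = np.zeros(Phi_blocks[0].shape[1])
    for _ in range(n_iter):
        grad = sum(Phi_b.T @ (Phi_b @ x - y_b)           # per-block forward steps
                   for Phi_b, y_b in zip(Phi_blocks, y_blocks))
        x = soft_threshold(x - step * grad, step * lam)  # proximal (backward) step
    return x
```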