We introduce a framework for quasi-Newton forward-backward splitting algorithms (proximal quasi-Newton methods) with a metric induced by diagonal ± rank-r symmetric positive definite matrices. This special type of metric allows for a highly efficient evaluation of the proximal mapping. The key to this efficiency is a general proximal calculus in the new metric. Using duality, we derive formulas that relate the proximal mapping in a rank-r modified metric to the proximal mapping in the original metric. We also describe efficient implementations of the proximity calculation for a large class of functions; the implementations exploit the piecewise-linear nature of the dual problem. We then apply these results to accelerate composite convex minimization problems, which leads to elegant quasi-Newton methods for which we prove convergence. The algorithm is tested on several numerical examples and compared against a comprehensive list of alternatives from the literature. Our quasi-Newton splitting algorithm with the prescribed metric compares favorably against the state of the art. The algorithm has extensive applications in signal processing, sparse recovery, machine learning, and classification, to name a few.
Contributions. We introduce a general proximal calculus in a metric V = P ± Q ∈ S++(N), where P ∈ S++(N) and Q is a positive semi-definite matrix of rank r. This significantly extends the result in the preliminary version of this paper [7], where only the case V = P + Q with a rank-1 matrix Q was addressed. The general calculus is accompanied by several more concrete examples (see Section 3.3.4 for a non-exhaustive list), where, for example, the piecewise-linear nature of certain dual problems is rigorously exploited.
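To make the rank-1 special case concrete, the following Python sketch computes the proximity operator of the ℓ1 norm in a metric V = diag(d) + uu^T by reducing it, via duality, to finding the root of a monotone piecewise-linear scalar function; this is only an illustrative implementation (function names, the bisection scheme, and the iteration count are our own choices, not the paper's code), shown here for soft-thresholding rather than the general function class treated in the text.

```python
import numpy as np

def prox_l1_diag(x, lam, d):
    """Prox of lam*||.||_1 in the diagonal metric D = diag(d):
    argmin_y lam*||y||_1 + 0.5*(y - x)^T D (y - x),
    i.e. componentwise soft-thresholding with thresholds lam/d_i."""
    return np.sign(x) * np.maximum(np.abs(x) - lam / d, 0.0)

def prox_l1_rank1(x, lam, d, u, iters=100):
    """Prox of lam*||.||_1 in the metric V = diag(d) + u u^T.
    Writing alpha = u^T (x - y) in the optimality condition shows that
    y = prox_D(x + alpha * u/d), where alpha is the unique root of the
    monotonically decreasing, piecewise-linear scalar map
        ell(alpha) = u^T (x - y(alpha)) - alpha."""
    def ell(alpha):
        y = prox_l1_diag(x + alpha * u / d, lam, d)
        return u @ (x - y) - alpha
    # Bracket the root of the decreasing function ell, then bisect.
    lo, hi = -1.0, 1.0
    while ell(lo) < 0.0:
        lo *= 2.0
    while ell(hi) > 0.0:
        hi *= 2.0
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if ell(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    alpha = 0.5 * (lo + hi)
    return prox_l1_diag(x + alpha * u / d, lam, d)
```

Because ell is piecewise linear, one could instead sort its breakpoints and solve for the root exactly; the bisection above is a simpler sketch whose output can be checked against the optimality condition 0 ∈ λ∂‖y‖₁ + V(y − x).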