Operator splitting schemes are a class of powerful algorithms for solving complicated monotone inclusion and convex optimization problems that are built from many simpler pieces. They give rise to algorithms in which each simple piece of the decomposition is processed individually. This leads to easily implementable and highly parallelizable or distributed algorithms, which often attain nearly state-of-the-art performance. In this paper, we analyze the convergence rate of the forward-Douglas-Rachford splitting (FDRS) algorithm, which is a generalization of the forward-backward splitting (FBS) and Douglas-Rachford splitting algorithms. Under general convexity assumptions, we derive the ergodic and nonergodic convergence rates of the FDRS algorithm, and show that these rates are the best possible. Under Lipschitz differentiability assumptions, we show that the best iterate of FDRS converges as quickly as the last iterate of the FBS algorithm. Under strong convexity assumptions, we derive convergence rates for a sequence that strongly converges to a minimizer. Under strong convexity and Lipschitz differentiability assumptions, we show that FDRS converges linearly. We also provide examples where the objective is strongly convex, yet FDRS converges arbitrarily slowly. Finally, we relate the FDRS algorithm to a primal-dual FBS scheme and clarify its place among existing splitting methods. Our results show that the FDRS algorithm automatically adapts to the regularity of the objective functions and achieves rates that improve upon the sharp worst-case rates that hold in the absence of smoothness and strong convexity.
The forward-backward splitting (FBS) algorithm [23] is another technique for solving (1.1) when g is known to be smooth. In this case, the proximal operator of g is never evaluated. Instead, FBS combines gradient (forward) steps with respect to g and proximal (backward) steps with respect to f. FBS is especially useful when the proximal operator of g is complex while its gradient is simple to compute.

Recently, the forward-Douglas-Rachford splitting (FDRS) algorithm [7] was proposed to combine DRS and FBS and extend their applicability (see Algorithm 1). More specifically, let V ⊆ H be a closed vector space and suppose g is smooth. Then FDRS applies to the following constrained problem:

    minimize f(x) + g(x)   subject to x ∈ V.    (1.2)

A motivating instance is the quadratic program

    minimize (1/2)⟨Qx, x⟩ + ⟨c, x⟩   subject to x ∈ C, Ax = b.    (1.5)

Problem (1.5) arises in the dual form of the soft-margin kernelized support vector machine classifier [14], in which C is a box constraint, b is 0, and A has rank one. Note that by the argument in (1.3), we can always assume that b = 0. Define the smooth function g(x) := (1/2)⟨Qx, x⟩ + ⟨c, x⟩, the indicator function f(x) := χ_C(x) (which is 0 on C and ∞ elsewhere), and the vector space V := {x ∈ ℝ^d | Ax = 0}. With this notation, (1.5) is in the form (1.2) and, thus, FDRS can be applied. This splitting is attractive because the gradient ∇g(x) = Qx + c is simple to evaluate, whereas the proximal operator of g requires a matrix inversion, prox_{γg} = (I_{ℝ^d} + γQ)^{−1} ∘ (I_{ℝ^d} − γc), which is expensive for large-scale problems.
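To make the cost gap concrete, the following NumPy sketch (illustrative only; `Q`, `c`, the step size `gamma`, and the dimension `d` are made-up data, not taken from the paper) contrasts the cheap gradient of the quadratic g with its proximal operator, which requires solving a linear system:

```python
import numpy as np

# Made-up problem data for illustration.
rng = np.random.default_rng(0)
d = 5
M = rng.standard_normal((d, d))
Q = M.T @ M + np.eye(d)          # symmetric positive definite
c = rng.standard_normal(d)
gamma = 0.1                      # step-size parameter

def grad_g(x):
    """Gradient of g(x) = (1/2)<Qx, x> + <c, x>: one matrix-vector product."""
    return Q @ x + c

def prox_g(x):
    """prox_{gamma g}(x) = argmin_y g(y) + (1/(2 gamma)) ||y - x||^2
    = (I + gamma Q)^{-1} (x - gamma c): needs a full linear solve."""
    return np.linalg.solve(np.eye(d) + gamma * Q, x - gamma * c)

x = rng.standard_normal(d)
p = prox_g(x)
# Optimality condition of the prox subproblem: grad_g(p) + (p - x)/gamma = 0.
assert np.allclose(grad_g(p) + (p - x) / gamma, 0.0)
```

For small `d` the solve is harmless, but for large-scale problems each prox evaluation costs a factorization or an inner iterative solver, while the forward step stays a single matrix-vector product; this is exactly why treating g by gradient steps is preferable here.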
Goals, challenges, and approaches. This work se...