In this paper, we follow the work of [17] to study quasi-Newton methods based on updating formulas from a certain subclass of the Broyden family. We focus on the common SR1 and BFGS quasi-Newton methods and establish improved explicit superlinear convergence rates. First, building on the greedy quasi-Newton update of [17], which greedily selects the direction that maximizes a certain measure of progress, we improve the linear convergence rate to a condition-number-free superlinear convergence rate when the method is applied with the well-known SR1 and BFGS updates. Moreover, our results also apply to the inverse approximation of the SR1 update. Second, based on a random update that selects the direction randomly from any spherically symmetric distribution, we establish the same superlinear convergence rate as above. Our analysis covers the approximation of a given Hessian matrix, unconstrained quadratic objectives, and general strongly convex, smooth, and strongly self-concordant functions.
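To make the greedy update described above concrete, the following minimal sketch maintains an approximation G of a fixed target Hessian A and, at each step, selects a coordinate direction according to a simple progress measure before applying the rank-one SR1 correction. The specific progress measure, the safeguard threshold, and the function names are illustrative assumptions, not necessarily the exact choices of [17] or of this paper.

```python
import numpy as np

def greedy_direction(G, A):
    """Greedy rule (illustrative): pick the coordinate vector e_i that
    maximizes the per-direction gap e_i^T (G - A) e_i."""
    i = int(np.argmax(np.diag(G - A)))
    e = np.zeros(G.shape[0])
    e[i] = 1.0
    return e

def sr1_update(G, A, u):
    """One SR1 correction driving G toward the target matrix A along u;
    the update is skipped when the denominator is numerically zero."""
    r = (G - A) @ u
    denom = u @ r
    if abs(denom) < 1e-12:
        return G
    return G - np.outer(r, r) / denom
```

The random variant mentioned above would simply replace greedy_direction with a draw from a spherically symmetric distribution.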
We study the convergence rate of the famous Symmetric Rank-1 (SR1) algorithm, which has wide applications in different scenarios. Although it has been extensively investigated, SR1 still lacks a non-asymptotic superlinear rate, in contrast to other quasi-Newton methods such as DFP and BFGS. In this paper we address this problem. Inspired by recent work on explicit convergence analysis of quasi-Newton methods, we obtain the first explicit non-asymptotic rates of superlinear convergence for the vanilla SR1 method with a correction strategy that ensures numerical stability. Specifically, the vanilla SR1 method with the correction strategy achieves rates of the form (4n ln(eκ)/k)^{k/2} for general smooth strongly convex functions, where k is the iteration counter, κ is the condition number of the objective function, and n is the dimension of the problem. For quadratic functions, the vanilla SR1 algorithm finds the optimum of the objective function in at most n steps.
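As a rough illustration of a vanilla SR1 iteration for minimization, the sketch below performs a quasi-Newton step and applies the rank-one update only when the denominator is safely bounded away from zero, the standard skip-style safeguard for numerical stability. The threshold r, the unit step size, and the function names are assumptions for illustration and need not match the correction strategy analyzed in the paper.

```python
import numpy as np

def sr1_step(x, B, grad, r=1e-8):
    """One vanilla SR1 iteration on a smooth strongly convex objective.
    `grad` maps a point to its gradient; a unit step is used for simplicity."""
    g = grad(x)
    s = -np.linalg.solve(B, g)          # quasi-Newton direction
    x_new = x + s
    y = grad(x_new) - g                 # gradient difference
    v = y - B @ s
    # skip the update unless the denominator is safely bounded away from zero
    if abs(v @ s) >= r * np.linalg.norm(s) * np.linalg.norm(v):
        B = B + np.outer(v, v) / (v @ s)
    return x_new, B
```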
In this paper, we study the explicit superlinear convergence rates of quasi-Newton methods. We particularly focus on the classical Broyden's methods for solving nonlinear equations and establish their explicit (local) superlinear convergence rates when the initial point is close enough to a solution and the initial approximate Jacobian is close enough to the exact Jacobian at that solution. Our results provide explicit superlinear convergence rates for Broyden's "good" and "bad" methods for the first time. These explicit rates also offer important insights into the performance difference between the "good" and "bad" methods. The theoretical findings of our convergence analysis of Broyden's methods are also validated empirically in this paper.
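For reference, the classical update formulas behind the two methods are easy to state: Broyden's "good" method corrects an approximation J of the Jacobian, while the "bad" method corrects an approximation H of its inverse. The sketch below, with illustrative function names, shows both rank-one updates and a basic "good" iteration loop; it is a minimal sketch, not the exact implementation studied in the paper.

```python
import numpy as np

def broyden_good_update(J, dx, df):
    """Broyden's "good" rank-one update of the Jacobian approximation J."""
    return J + np.outer(df - J @ dx, dx) / (dx @ dx)

def broyden_bad_update(H, dx, df):
    """Broyden's "bad" rank-one update of the inverse-Jacobian approximation H."""
    return H + np.outer(dx - H @ df, df) / (df @ df)

def broyden_good(F, x, J, iters=50, tol=1e-10):
    """Solve F(x) = 0 with Broyden's "good" method, starting from an initial
    point x and approximate Jacobian J assumed close to a solution."""
    fx = F(x)
    for _ in range(iters):
        if np.linalg.norm(fx) < tol:
            break
        dx = -np.linalg.solve(J, fx)
        x = x + dx
        fx_new = F(x)
        J = broyden_good_update(J, dx, fx_new - fx)
        fx = fx_new
    return x
```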
In this paper, we propose greedy and random variants of Broyden's method for solving nonlinear equations. Specifically, the greedy method greedily selects the direction that maximizes a certain measure of progress in approximating the current Jacobian matrix, while the random method chooses a direction at random. We establish explicit (local) superlinear convergence rates for both methods when the initial point and the initial approximate Jacobian are close enough to a solution and the corresponding Jacobian. Our two novel variants of Broyden's method enjoy two important advantages: their approximate Jacobians converge to the exact ones, and their convergence rates are asymptotically faster than that of the original Broyden's method. Our work is the first to establish these two advantages theoretically. Our experiments also empirically validate the advantages of our algorithms.
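The distinction between the two proposed variants lies only in how the update direction is chosen before the rank-one Jacobian correction. The following is a minimal sketch under illustrative assumptions: the greedy measure below picks the coordinate with the largest column-wise Jacobian error, which may differ from the progress measure used in the paper, and the random rule draws a standard Gaussian direction, one example of a spherically symmetric distribution.

```python
import numpy as np

def broyden_update_along(J, A, u):
    """Rank-one correction of the Jacobian approximation J toward the target
    Jacobian A along a chosen direction u (only the product A @ u is needed)."""
    return J + np.outer((A - J) @ u, u) / (u @ u)

def random_direction(n, rng=None):
    """Random rule: draw u from a spherically symmetric distribution
    (here a standard Gaussian)."""
    rng = np.random.default_rng() if rng is None else rng
    return rng.standard_normal(n)

def greedy_direction(J, A):
    """Greedy rule (illustrative): pick the coordinate vector along which the
    current Jacobian error A - J has the largest Euclidean norm."""
    i = int(np.argmax(np.linalg.norm(A - J, axis=0)))
    e = np.zeros(J.shape[0])
    e[i] = 1.0
    return e
```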