In this work we present and analyze a new scaled conjugate gradient algorithm and its implementation, based on an interpretation of the secant equation and on the inexact Wolfe line search conditions. The best spectral conjugate gradient algorithm SCG of Birgin and Martínez (2001), which is mainly a scaled variant of Perry's method (1977), is modified so as to overcome the lack of positive definiteness of the matrix defining the search direction. This modification is based on the quasi-Newton BFGS updating formula. The computational scheme is embedded in the Beale-Powell restart philosophy. The parameter scaling the gradient is selected either as the spectral gradient or in an anticipative manner, by means of a formula using the function values at two successive points. Under very mild conditions it is shown that, for strongly convex functions, the algorithm is globally convergent. Preliminary computational results on a set of 500 unconstrained optimization test problems show that this new scaled conjugate gradient algorithm substantially outperforms the spectral conjugate gradient algorithm SCG.
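A minimal sketch of the spectral choice of the scaling parameter, assuming s = x_{k+1} − x_k and y = g_{k+1} − g_k are available as numpy vectors; the curvature guard and the fallback value are illustrative safeguards, not taken from the paper.

```python
import numpy as np

def spectral_scaling(s, y, fallback=1.0, eps=1e-12):
    # Spectral (Barzilai-Borwein type) scaling parameter
    # theta_k = s^T s / s^T y: a Rayleigh-quotient estimate of the
    # inverse Hessian scale along the last step.
    sty = float(np.dot(s, y))
    if sty <= eps:        # guard against lost curvature (assumption)
        return fallback
    return float(np.dot(s, s)) / sty
```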
Another hybrid conjugate gradient algorithm is subject to analysis. The parameter β_k is computed as a convex combination of β_k^HS (Hestenes-Stiefel) and β_k^DY (Dai-Yuan). The parameter θ_k in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the Newton direction and the pair (s_k, y_k) satisfies the quasi-Newton equation ∇²f(x_{k+1}) s_k = y_k, where s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. The algorithm uses the standard Wolfe line search conditions. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms the Hestenes-Stiefel and the Dai-Yuan conjugate gradient algorithms, as well as the hybrid conjugate gradient algorithms of Dai and Yuan. A set of 750 unconstrained optimization problems is used, some of them from the CUTE library.
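A minimal sketch of how such a convex combination of the two parameters could be formed, assuming g_new and g_old are successive gradients and d is the current search direction; the clipping of θ_k and the restart fallback are illustrative safeguards, and θ_k is taken as an input here, whereas the paper determines it from the Newton-direction condition.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d, theta, eps=1e-12):
    # beta_k = (1 - theta_k) * beta_HS + theta_k * beta_DY,
    # a convex combination of the Hestenes-Stiefel and Dai-Yuan parameters.
    y = g_new - g_old                              # y_k = g_{k+1} - g_k
    dty = float(np.dot(d, y))
    if abs(dty) < eps:                             # degenerate curvature: restart
        return 0.0
    beta_hs = float(np.dot(g_new, y)) / dty        # Hestenes-Stiefel
    beta_dy = float(np.dot(g_new, g_new)) / dty    # Dai-Yuan
    t = min(max(theta, 0.0), 1.0)                  # keep the combination convex
    return (1.0 - t) * beta_hs + t * beta_dy
```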
In this paper we introduce an acceleration of the gradient descent algorithm with backtracking. The idea is to modify the steplength t_k multiplicatively, by means of a positive parameter θ_k, in such a way as to improve the behaviour of the classical gradient algorithm. It is shown that the resulting algorithm remains linearly convergent, but the reduction in function value is significantly improved.
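A minimal sketch of one such accelerated step, assuming f and grad are callables and x is a numpy vector; the quadratic-interpolation choice of θ_k used here is an illustrative way to pick the multiplier, not necessarily the formula analyzed in the paper.

```python
import numpy as np

def accelerated_gd_step(f, grad, x, t0=1.0, rho=0.5, c=1e-4):
    g = grad(x)
    gg = float(np.dot(g, g))
    if gg == 0.0:                    # stationary point: nothing to do
        return x
    fx = f(x)
    # Backtracking: shrink t until the Armijo sufficient-decrease
    # condition holds along the steepest-descent direction -g.
    t = t0
    while f(x - t * g) > fx - c * t * gg:
        t *= rho
    # Multiplicative correction theta of the accepted steplength, here
    # obtained from a quadratic model of f along -g (illustrative choice).
    ft = f(x - t * g)
    denom = 2.0 * (ft - fx + t * gg)
    theta = (t * gg) / denom if denom > 1e-12 else 1.0
    return x - theta * t * g
```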
An accelerated hybrid conjugate gradient algorithm is the subject of this paper. The parameter β_k is computed as a convex combination of β_k^HS (Hestenes and Stiefel, J Res Nat Bur Stand 49:409-436, 1952) and β_k^DY (Dai and Yuan, SIAM J Optim 10:177-182, 1999). The parameter θ_k in the convex combination is computed in such a way that the direction corresponding to the conjugate gradient algorithm is the best direction we know, i.e. the Newton direction, while the pair (s_k, y_k) satisfies the modified secant condition given by Li et al. (J Comput Appl Math 202:523-539, 2007), B_{k+1} s_k = z_k, where z_k = y_k + (η_k / ‖s_k‖²) s_k, η_k = 2(f_k − f_{k+1}) + (g_k + g_{k+1})^T s_k, s_k = x_{k+1} − x_k and y_k = g_{k+1} − g_k. It is shown that both for uniformly convex functions and for general nonlinear functions the algorithm with strong Wolfe line search is globally convergent. The algorithm uses an acceleration scheme that modifies the steplength α_k in order to improve the reduction of the function values along the iterations. Numerical comparisons with conjugate gradient algorithms show that this hybrid computational scheme outperforms a variant of the hybrid conjugate gradient algorithm given by Andrei (Numer Algorithms 47:143-156, 2008), in which the pair (s_k, y_k) satisfies the classical secant condition B_{k+1} s_k = y_k, as well as some other conjugate gradient algorithms, including Hestenes-Stiefel and Dai-Yuan. A set of 75 unconstrained optimization problems with 10 different dimensions is used (Andrei, Adv Model Optim 10:147-161, 2008).
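A minimal sketch of the right-hand side z_k of this modified secant condition, computed directly from the quantities defined in the abstract above; the degenerate-step fallback is an illustrative safeguard, not from the paper.

```python
import numpy as np

def modified_secant_rhs(x_old, x_new, g_old, g_new, f_old, f_new):
    # z_k = y_k + (eta_k / ||s_k||^2) s_k, with
    # eta_k = 2 (f_k - f_{k+1}) + (g_k + g_{k+1})^T s_k.
    s = x_new - x_old                                # s_k
    y = g_new - g_old                                # y_k
    eta = 2.0 * (f_old - f_new) + float(np.dot(g_old + g_new, s))
    ss = float(np.dot(s, s))
    if ss < 1e-12:                   # degenerate step: fall back to y_k
        return y
    return y + (eta / ss) * s
```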