In this paper, we propose a three-term conjugate gradient method based on the symmetric rank-one (SR1) update. The basic idea is to exploit the good properties of the SR1 update in providing quality Hessian approximations to construct a conjugate gradient search direction that requires no matrix storage and possesses the sufficient descent property. Numerical experiments on a set of standard unconstrained optimization problems show that the proposed method is superior to many well-known conjugate gradient methods in terms of efficiency and robustness.
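The paper's SR1-based coefficients are not reproduced in this abstract; as an illustration of the three-term structure and of a direction that satisfies sufficient descent by construction, the following sketch uses the well-known Zhang–Zhou–Li three-term PRP direction, for which g_k^T d_k = -||g_k||^2 holds identically:

```python
import numpy as np

def ttcg_minimize(f, grad, x0, tol=1e-6, max_iter=2000):
    """Three-term conjugate gradient sketch (Zhang-Zhou-Li PRP variant,
    used here only to illustrate the three-term structure; the paper's
    SR1-based coefficients differ).  Direction:
        d_k = -g_k + beta_k * d_{k-1} - theta_k * y_{k-1},
    which satisfies g_k^T d_k = -||g_k||^2 (sufficient descent)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (always succeeds: d is a descent direction).
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, gTd = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gTd:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        gg = g @ g
        beta = (g_new @ y) / gg    # PRP coefficient
        theta = (g_new @ d) / gg   # third-term coefficient
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Example: minimize a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = ttcg_minimize(f, grad, np.zeros(2))
```

Note that no matrix is ever stored or factored: the iteration uses only vectors, which is the feature the abstract emphasizes for large-scale problems.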
In this paper, we present a new approach for solving fuzzy nonlinear equations. Unlike some Newton-like methods, which compute the Jacobian matrix at every iteration, our approach requires computing it only once throughout the iterations. The fuzzy coefficients are presented in parametric form. Numerical results on well-known benchmark fuzzy nonlinear equations are reported to demonstrate the effectiveness and efficiency of the approach.
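The core idea of evaluating the Jacobian only once is the classical "chord" (fixed-Newton) scheme. The sketch below shows it on a crisp nonlinear system, as a stand-in for the parametric-form systems that arise from the fuzzy coefficients (the fuzzy machinery itself is not reproduced here):

```python
import numpy as np

def chord_solve(F, J, x0, tol=1e-10, max_iter=100):
    """Newton-like iteration that evaluates the Jacobian only once, at x0,
    and reuses it at every iteration (the 'chord' scheme), avoiding the
    per-iteration Jacobian cost of full Newton's method."""
    x = np.asarray(x0, dtype=float)
    J0 = J(x)  # single Jacobian evaluation for the whole run
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        x = x - np.linalg.solve(J0, Fx)  # reuse the frozen Jacobian J0
    return x

# Example system: x^2 + y^2 = 4, x = y, with root (sqrt(2), sqrt(2)).
F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] - v[1]])
J = lambda v: np.array([[2.0 * v[0], 2.0 * v[1]], [1.0, -1.0]])
root = chord_solve(F, J, np.array([1.5, 1.5]))
```

The trade-off is standard: the frozen Jacobian reduces convergence from quadratic to linear, but each iteration is much cheaper, which pays off when Jacobian evaluation dominates the cost.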
Conjugate gradient methods have played a useful and powerful role in solving large-scale optimization problems, which have become increasingly important in many disciplines, such as engineering, statistics, the physical sciences, and the social and behavioral sciences. In this paper, we present an application of a proposed three-term conjugate gradient method to regression analysis. Numerical experiments show that the proposed method is promising and superior to many well-known conjugate gradient methods in terms of efficiency and robustness.
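To make the regression setting concrete: least-squares regression is the unconstrained minimization of f(β) = ½‖Xβ − y‖², with gradient X^T(Xβ − y), so any conjugate gradient method applies directly. The proposed three-term method is not reproduced here; this sketch uses a plain Fletcher–Reeves iteration with an exact line search (valid because the objective is quadratic) on synthetic noiseless data:

```python
import numpy as np

# Synthetic regression data: intercept 2.0, slope -3.0, no noise for clarity.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
beta_true = np.array([2.0, -3.0])
y = X @ beta_true

grad = lambda b: X.T @ (X @ b - y)  # gradient of 0.5 * ||X b - y||^2

b = np.zeros(2)
g = grad(b)
d = -g
for _ in range(200):
    if np.linalg.norm(g) < 1e-10:
        break
    # Exact line search: for a quadratic, the minimizer along d is closed-form.
    Xd = X @ d
    alpha = -(g @ d) / (Xd @ Xd)
    b = b + alpha * d
    g_new = grad(b)
    beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves coefficient
    d = -g_new + beta * d
    g = g_new
```

On an n-parameter quadratic objective such as this, conjugate gradients converge in at most n steps in exact arithmetic, which is part of what makes the method attractive for large regression problems.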
The symmetric rank-one (SR1) update is known to have good numerical performance among the quasi-Newton methods for solving unconstrained optimization problems, as evident from the recent study of Farzin et al. (2011). However, it is well known that the SR1 update may not preserve positive definiteness even when updated from a positive definite approximation, and it can be undefined because of a zero denominator. In this paper, we propose some scaling strategies to overcome these well-known shortcomings of the SR1 update. Numerical experiments show that the proposed strategies are very competitive and exhibit a clear improvement in numerical performance over SR1 algorithms with some existing strategies for avoiding a zero denominator and preserving positive definiteness.
Mathematics Subject Classification: 65H11, 65K05