The primary objective of this paper, which lies in the field of conjugate gradient algorithms for unconstrained optimization, is to demonstrate the advantage of the newly proposed algorithm over the standard Hestenes-Stiefel (HS) method. Since the conjugate gradient update parameter is crucial to the performance of these methods, we propose a simple modification of it and use it to derive the new update formula described in this paper. The modification is based on the conjugacy condition for nonlinear conjugate gradient methods, extended with a nonnegative parameter. Under mild Wolfe conditions, the global convergence theorem and supporting lemmas are stated and proved. The proposed method's efficiency is demonstrated on numerical test instances, with very encouraging results.
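For orientation, the baseline iteration and the classical Hestenes-Stiefel parameter referred to above take the standard form (this is the textbook coefficient, not the paper's modified one):

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad \beta_k^{HS} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k},
\]

where \(g_k\) denotes the gradient at \(x_k\) and \(y_k = g_{k+1} - g_k\). The conjugacy condition the modification builds on is \(d_{k+1}^{T} y_k = 0\), and the Wolfe conditions on the step size \(\alpha_k\) are

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{T} d_k, \qquad g(x_k + \alpha_k d_k)^{T} d_k \ge \sigma g_k^{T} d_k, \qquad 0 < \delta < \sigma < 1.
\]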
In unconstrained optimization, we employ the memoryless quasi-Newton procedure to construct a new conjugacy coefficient for conjugate gradient methods. The updating formula is obtained by scaling the well-known Broyden-Fletcher-Goldfarb-Shanno (BFGS) formula by a self-scaling factor, yielding a conjugacy coefficient that produces a satisfactory descent direction and retains global convergence when the proposed method is compared to the standard HS conjugate gradient method. The theorems are studied in detail, and the numerical results, obtained with a Fortran implementation, are extremely stable.
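For reference, the BFGS update being scaled is the standard one (the paper's self-scaling factor and the resulting coefficient are not reproduced here):

\[
B_{k+1} = B_k - \frac{B_k s_k s_k^{T} B_k}{s_k^{T} B_k s_k} + \frac{y_k y_k^{T}}{y_k^{T} s_k}, \qquad s_k = x_{k+1} - x_k, \quad y_k = g_{k+1} - g_k.
\]

In the memoryless variant, \(B_k\) is replaced by the identity matrix (scaled by a self-scaling factor), so that the quasi-Newton direction \(-B_{k+1}^{-1} g_{k+1}\) can be rewritten in the conjugate gradient form \(d_{k+1} = -g_{k+1} + \beta_k d_k\), from which a conjugacy coefficient is read off.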
Recently, conjugate gradient methods for unconstrained optimization have been widely used, especially for large-scale problems. This work proposes a new spectral gradient coefficient, obtained as a convex linear combination of two different gradient coefficients, for solving unconstrained optimization problems. One of the most essential features of the suggested strategy is that it guarantees a suitable descent direction under the line search used. Furthermore, on the test problems the proposed strategy proved more effective than previous conjugate gradient approaches. When compared to other conjugate gradient methods, such as the FR method, the proposed method is confirmed to be globally convergent, indicating that it can be used in scientific data computation.
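In generic form, a convex combination of two conjugacy coefficients \(\beta_k^{(1)}\) and \(\beta_k^{(2)}\) (the specific pair and the choice of the mixing parameter are given in the paper) reads

\[
\beta_k = \theta_k \beta_k^{(1)} + (1 - \theta_k)\, \beta_k^{(2)}, \qquad \theta_k \in [0, 1],
\]

and the FR coefficient used for comparison is \(\beta_k^{FR} = \|g_{k+1}\|^{2} / \|g_k\|^{2}\). The descent property referred to above is the sufficient descent condition \(g_k^{T} d_k \le -c \|g_k\|^{2}\) for some constant \(c > 0\).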
In this article, we propose a new conjugate gradient method for solving nonlinear unconstrained optimization problems. The positive step size is obtained by a line search, and the new scalar defining the search direction is derived from the quadratic model and a Taylor series expansion, using the quasi-Newton condition and the Newton direction. We also prove that the search direction of the new conjugate gradient method satisfies the sufficient descent condition, and all assumptions of the global convergence property are stated and proved. To assess the practical benefit of the method, we report numerical results, obtained with a FORTRAN implementation, comparing the new algorithm with the HS and PRP methods on the same set of unconstrained optimization test problems; the results are very efficient and encouraging.
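As a minimal, runnable sketch of the framework in which such comparisons take place, the following Python code implements the standard nonlinear conjugate gradient loop with the Hestenes-Stiefel coefficient and a Wolfe line search. It is a generic baseline under stated assumptions, not any of the papers' new methods; the steepest-descent restart and the fallback step size are illustrative safeguards.

    import numpy as np
    from scipy.optimize import line_search

    def cg_hs(f, grad, x0, tol=1e-6, max_iter=1000):
        # Nonlinear conjugate gradient with the Hestenes-Stiefel coefficient.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # Wolfe line search for the positive step size alpha_k.
            alpha = line_search(f, grad, x, d)[0]
            if alpha is None:
                d = -g  # restart along steepest descent if the search fails
                alpha = line_search(f, grad, x, d)[0] or 1e-4
            x_new = x + alpha * d
            g_new = grad(x_new)
            y = g_new - g  # y_k = g_{k+1} - g_k
            denom = d @ y
            beta = (g_new @ y) / denom if abs(denom) > 1e-12 else 0.0
            d = -g_new + beta * d  # d_{k+1} = -g_{k+1} + beta_k * d_k
            x, g = x_new, g_new
        return x

For example, cg_hs(rosen, rosen_der, np.array([-1.2, 1.0])), with rosen and rosen_der imported from scipy.optimize, minimizes the Rosenbrock test function, a standard instance in unconstrained optimization test sets.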