The primary objective of this paper, situated in the field of conjugate gradient algorithms for unconstrained optimization, is to show the advantage of the newly proposed algorithm over the standard Hestenes–Stiefel (HS) method. Since the conjugate gradient update parameter is crucial to performance, we propose a simple modification of it and use this modification to derive the new formula for the update parameter described in this paper. The modification is based on the conjugacy condition for nonlinear conjugate gradient methods, extended by a nonnegative parameter. Under mild Wolfe conditions, the supporting lemmas and a global convergence theorem are stated and proved. The proposed method was implemented, and its efficiency is demonstrated on numerical instances, with very encouraging results.
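For readers unfamiliar with the quantities named in this abstract, the sketch below collects only the standard background: the HS update parameter, the classical conjugacy condition that the HS parameter satisfies by construction, and the weak Wolfe conditions. The paper's actual modification, the nonnegative-parameter extension of the conjugacy condition, is not reproduced here.

```latex
% Nonlinear CG iteration: x_{k+1} = x_k + \alpha_k d_k,
% d_{k+1} = -g_{k+1} + \beta_k d_k,
% with g_k = \nabla f(x_k) and y_k = g_{k+1} - g_k.
\begin{align*}
\beta_k^{\mathrm{HS}} &= \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
  && \text{(Hestenes--Stiefel parameter)} \\
d_{k+1}^{\top} y_k &= 0
  && \text{(classical conjugacy condition)} \\
f(x_k + \alpha_k d_k) &\le f(x_k) + c_1 \alpha_k\, g_k^{\top} d_k
  && \text{(Wolfe: sufficient decrease)} \\
g(x_k + \alpha_k d_k)^{\top} d_k &\ge c_2\, g_k^{\top} d_k
  && \text{(Wolfe: curvature)}, \quad 0 < c_1 < c_2 < 1
\end{align*}
```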
In unconstrained optimization algorithms, we employ the memoryless quasi-Newton procedure to construct a new conjugacy coefficient for conjugate gradient methods. The new updating formula is obtained by scaling the well-known Broyden–Fletcher–Goldfarb–Shanno (BFGS) formula by a self-scaling factor, which yields a form of the conjugacy coefficient that produces a satisfactory descent direction and retains global convergence properties when the proposed method is compared with the standard HS conjugate gradient method. The theorems are studied in detail; moreover, the numerical results of this paper, obtained from a Fortran implementation, are extremely stable.
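As background for this construction, a minimal sketch of the classical memoryless BFGS search direction (the BFGS update applied to the identity matrix, then applied to the negative gradient) is given below; the paper's self-scaling factor is an additional ingredient that is not shown here.

```latex
% Memoryless BFGS: take H_{k+1} as the BFGS update of H = I and set
% d_{k+1} = -H_{k+1} g_{k+1}, with s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k.
d_{k+1} = -g_{k+1}
  + \frac{g_{k+1}^{\top} s_k}{y_k^{\top} s_k}\, y_k
  + \left( \frac{g_{k+1}^{\top} y_k}{y_k^{\top} s_k}
  - \Bigl( 1 + \frac{y_k^{\top} y_k}{y_k^{\top} s_k} \Bigr)
    \frac{g_{k+1}^{\top} s_k}{y_k^{\top} s_k} \right) s_k
```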
Using a second-order approximation, a variation of the standard secant technique for nonlinear problems has been developed. The iterative formula is derived from a Taylor series expansion that includes an estimate of the second derivative of Θ(μ). It is demonstrated that the new approach has quadratic convergence. In a practical comparison with Newton's method on seven test functions, the method turns out to perform efficiently.
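The abstract does not reproduce the new iterative formula, so the sketch below shows only the classical secant iteration it starts from, written in the abstract's Θ(μ) notation; the paper's variant augments this with a second-derivative estimate obtained from the Taylor expansion.

```latex
% Classical secant iteration for \Theta(\mu) = 0: the derivative
% \Theta'(\mu_k) in Newton's method is replaced by a finite
% difference over the two most recent iterates.
\mu_{k+1} = \mu_k - \Theta(\mu_k)\,
  \frac{\mu_k - \mu_{k-1}}{\Theta(\mu_k) - \Theta(\mu_{k-1})}
```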
Recently, conjugate gradient methods for unconstrained optimization have been widely utilized, especially for large-scale problems. This work proposes a new spectral gradient coefficient, obtained from a convex linear combination of two different gradient coefficients, to solve unconstrained optimization problems. One of the most essential features of our suggested strategy is that it guarantees a sufficient descent direction under the line-search accuracy used. Furthermore, the proposed strategy is more effective and more stable than previous conjugate gradient approaches, as observed on the test problems. Moreover, when compared with other conjugate gradient methods, such as the FR method, the proposed method is confirmed to be globally convergent, indicating that it can be used in scientific data computation.
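The abstract does not identify the two coefficients being combined, so the sketch below shows only the generic shape of such a convex combination, with \beta_k^{A} and \beta_k^{B} as placeholders, together with the sufficient descent condition that the combination is meant to guarantee.

```latex
% Generic convex combination of two CG coefficients; \beta_k^{A} and
% \beta_k^{B} are placeholders, not the paper's actual choices.
\beta_k^{\mathrm{new}} = \theta_k\, \beta_k^{A} + (1 - \theta_k)\, \beta_k^{B},
  \qquad \theta_k \in [0, 1]
% Sufficient descent condition for d_k (for some constant c > 0):
g_k^{\top} d_k \le -c\, \lVert g_k \rVert^{2}
```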