Background: A new optimization algorithm is presented. The method is based on a new nonmonotone line search, an accelerated three-term conjugate gradient method, and a damped quasi-Newton method. Compared with previous methods, the efficiency gains on different optimization problems, often by more than one factor, are dramatic owing to the technique's ability to utilize existing data. Materials and Methods: a new nonmonotone line search, a new modification of the damped quasi-Newton method, the motivation for and statement of the new quasi-Newton algorithm (MQ), and its global convergence. Results: In this work we compare the new algorithm with classical strategies such as [7] on unconstrained nonlinear optimization problems, using test functions obtained from Andrei [5, 6], Waziri and Sabiu (2015) [10], and La Cruz et al. (2004) [3]. The numerical experiments demonstrate the performance of the proposed method. We selected seven unconstrained problems whose dimensions vary from 10 to 100; considering three sizes of each problem gives 21 test problems in total. The iterations stop when the stopping criterion is satisfied. All codes were written in Matlab R2017a and run on a PC with an Intel Core i4 processor (2.3 GHz CPU) and 4 GB of RAM; each test problem was solved from two different initial starting points. Conclusion: This article has presented an accelerated three-term efficient algorithm for numerical optimization. The method is completely derivative-free and requires fewer iterations (NOI), fewer function evaluations (NOF), and less CPU time than existing methods. Global convergence was also proved under classical assumptions. Numerical results show that the three-term efficient algorithm is promising.
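The abstract does not reproduce the update formulas of the MQ algorithm. As a minimal sketch only, the following Python code illustrates the general shape of a three-term conjugate gradient step combined with a Grippo–Lampariello–Lucidi-style nonmonotone line search; the HS-like parameters `beta` and `theta`, the safeguards, and all function names are assumptions for illustration, not the paper's method.

```python
import numpy as np

def nonmonotone_three_term_cg(f, grad, x0, M=10, delta=1e-4, tol=1e-6, max_iter=500):
    """Illustrative three-term CG with a nonmonotone Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                        # start with steepest descent
    f_hist = [f(x)]               # recent values for the nonmonotone test
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        if g.dot(d) >= 0:         # safeguard: restart if not a descent direction
            d = -g
        f_ref = max(f_hist[-M:])  # nonmonotone reference (max of last M values)
        alpha = 1.0
        while alpha > 1e-12 and f(x + alpha * d) > f_ref + delta * alpha * g.dot(d):
            alpha *= 0.5          # backtrack until the relaxed Armijo test holds
        s = alpha * d
        x_new, g_new = x + s, grad(x + s)
        y = g_new - g
        denom = max(d.dot(y), 1e-12)       # safeguarded curvature term
        beta = g_new.dot(y) / denom        # HS-like CG parameter (assumed)
        theta = g_new.dot(s) / denom       # third-term weight (assumed)
        d = -g_new + beta * d - theta * y  # generic three-term direction
        x, g = x_new, g_new
        f_hist.append(f(x))
    return x

# Usage on the Rosenbrock function from the classical starting point.
f = lambda x: 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2
grad = lambda x: np.array([-400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
                           200.0 * (x[1] - x[0]**2)])
print(nonmonotone_three_term_cg(f, grad, [-1.2, 1.0]))
```

Comparing the trial point against the maximum of the last M function values, rather than the current value alone, permits occasional increases in the objective and is what makes the search nonmonotone.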
In the present study, we propose a three-term preconditioned gradient memory algorithm to solve nonlinear optimization problems. The new algorithm subsumes several other families of nonlinear preconditioned gradient memory algorithms as subfamilies, and it employs Powell's restart criterion and inexact Armijo line searches. Numerical experiments on twenty-one well-known test functions with various dimensions were generally encouraging and showed that the new algorithm is more stable and efficient than the standard three-term CG algorithm.
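The two ingredients named here have standard classical forms. A minimal sketch of both, assuming nothing beyond those classical definitions (the 0.2 threshold is Powell's traditional choice; all names and defaults are illustrative):

```python
import numpy as np

def powell_restart_needed(g_new, g_old, threshold=0.2):
    """Powell's restart test: restart with steepest descent when
    consecutive gradients lose orthogonality, i.e. when
    |g_{k+1}^T g_k| >= threshold * ||g_{k+1}||^2."""
    return abs(g_new.dot(g_old)) >= threshold * g_new.dot(g_new)

def armijo_step(f, x, g, d, alpha0=1.0, rho=0.5, c=1e-4):
    """Inexact Armijo backtracking: shrink alpha until the
    sufficient-decrease condition f(x + a*d) <= f(x) + c*a*g^T d holds."""
    alpha, fx = alpha0, f(x)
    while alpha > 1e-12 and f(x + alpha * d) > fx + c * alpha * g.dot(d):
        alpha *= rho
    return alpha
```

Inside a CG-type loop, `powell_restart_needed` would be evaluated each iteration and the direction reset to the negative gradient whenever it fires.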
In this work we present a new gradient-descent-type algorithm, in which the stepsize is computed by means of a simple approximation of the Hessian matrix, to solve nonlinear unconstrained optimization problems. The proposed algorithm uses a new approximation of the Hessian based on the function values and gradients at two successive points along the iterations, one variant of which uses Biggs' modified formula to locate the new points. The algorithm belongs to the same class of superlinearly convergent descent algorithms and has been programmed to obtain numerical results for a selected class of nonlinear test functions with various dimensions. Numerical experiments show that the new choice of step length requires less computational work and greatly speeds up the convergence of the gradient algorithm, especially for large-scale unconstrained optimization problems.
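The abstract does not state the Hessian approximation itself, so the sketch below substitutes a Barzilai–Borwein-type scalar quotient built from two successive iterates; Biggs' modified formula, which additionally uses function values, is not reproduced here, and every name and safeguard is an assumption.

```python
import numpy as np

def scalar_hessian_gradient_descent(grad, x0, tol=1e-6, max_iter=1000):
    """Gradient descent whose stepsize comes from a scalar Hessian
    approximation formed from two successive iterates (a BB-type
    stand-in for the paper's formula, which the abstract omits)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-3                         # conservative first step
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sty = s.dot(y)
        # Treat (s^T y)/(s^T s) as a scalar Hessian estimate; the step
        # is its inverse, safeguarded when the curvature is not positive.
        alpha = s.dot(s) / sty if sty > 1e-12 else 1e-3
        x, g = x_new, g_new
    return x
```

Because the scalar estimate needs only one extra gradient per iteration, the per-step cost stays close to that of plain gradient descent, which is consistent with the reduced computational work reported above.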
A new hybrid quasi-Newton search direction (HQN<sup>EI</sup>) is proposed. It combines the Broyden–Fletcher–Goldfarb–Shanno (BFGS) update formula with a certain conjugate gradient (CG) parameter through a nested direction. Global convergence, a superlinear convergence rate, and the sufficient descent property are proved under exact line search. Finally, computational comparisons with the two hybrid parents, BFGS and CG, in terms of iteration counts and CPU running time show the superiority of HQN<sup>EI</sup>; the results thus mark a clear preference for HQN<sup>EI</sup> over the two parent algorithms.
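The abstract does not give HQN<sup>EI</sup>'s exact CG parameter or the form of the nesting, so the following sketch shows just one plausible shape of such a hybrid: a Fletcher–Reeves parameter (chosen purely for illustration) added to a direction preconditioned by the standard BFGS inverse-Hessian update.

```python
import numpy as np

def hybrid_bfgs_cg_direction(H, g_new, g_old, d_old):
    """Illustrative hybrid direction: BFGS-preconditioned negative
    gradient plus a CG correction term. The Fletcher-Reeves beta is an
    assumption; HQN^EI's actual parameter is not in the abstract."""
    beta = g_new.dot(g_new) / g_old.dot(g_old)   # Fletcher-Reeves parameter
    return -H.dot(g_new) + beta * d_old

def bfgs_update(H, s, y):
    """Standard BFGS update of the inverse-Hessian approximation H,
    assuming the curvature condition s^T y > 0 holds."""
    rho = 1.0 / s.dot(y)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)
```

A driver loop would compute the direction, take a line-search step, and then refresh H with `bfgs_update` using s = x_new - x and y = g_new - g_old.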