“…The computational experiment is based on the number of iterations and CPU time. To ascertain the global convergence of the proposed method, the benchmark problems in [21,22] were used with two different initial starting points (ISP), and the performance of the methods was evaluated using the performance profile introduced by Dolan and Moré [24]. The performance profile ρₛ : ℝ → [0,1] is defined as follows: let P and S be the set of problems and the set of solvers, respectively.…”
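The Dolan–Moré profile described above can be computed directly from a table of solver costs: for each problem, each solver's cost is divided by the best cost on that problem, and ρₛ(τ) is the fraction of problems solver s solves within a factor τ of the best. A minimal sketch (the cost table below is hypothetical, not data from the article):

```python
import numpy as np

def performance_profile(T, taus):
    """Compute Dolan-More performance profiles.

    T: (n_problems, n_solvers) array of costs (e.g. iterations or CPU time);
       np.inf marks a failure on that problem.
    Returns rho of shape (len(taus), n_solvers): rho[i, s] is the fraction
    of problems solver s solves within a factor taus[i] of the best solver.
    """
    T = np.asarray(T, dtype=float)
    best = T.min(axis=1, keepdims=True)              # best cost per problem
    ratios = T / best                                # performance ratios r_{p,s}
    rho = np.array([(ratios <= tau).mean(axis=0) for tau in taus])
    return rho

# Hypothetical results: 4 problems x 2 solvers, inf = failure
T = [[10.0, 12.0],
     [5.0, 20.0],
     [np.inf, 8.0],
     [7.0, 7.0]]
rho = performance_profile(T, taus=[1.0, 2.0, 4.0])   # rho[0] at tau = 1 gives
                                                     # each solver's win fraction
```

Plotting ρₛ against τ for every solver on one set of axes gives the profile curves used to compare the methods: the higher curve at τ = 1 is the most-often-fastest solver, and the curve that reaches 1 first is the most robust.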
Mathematical models arising in recent research are mostly nonlinear in nature, and numerical methods for solving the resulting systems of equations are widely needed across applied mathematics. Although this field has received serious attention in recent years and new approaches have been discovered, the efficiency of earlier versions still suffers setbacks. This article presents a new hybrid conjugate gradient (CG) parameter; the resulting method is derivative-free and is analyzed with an effective inexact line search under given conditions. Theoretical proofs show that the proposed method retains the sufficient descent and global convergence properties of the original CG methods. The proposed method is tested on a set of test functions and compared with the two classical CG parameters from which it is derived, with performance measured by number of iterations and CPU time. The numerical results show that the new method is efficient and effective among all the methods tested, and the graphical representation of the results supports these findings. The computational results indicate that the new hybrid conjugate gradient parameter is suitable for solving symmetric systems of nonlinear equations.
Section: List Of Benchmark Test Problems Used
“…F(x) = 0, x ∈ ℝⁿ; (Eq 1) where F : ℝⁿ → ℝⁿ is continuously differentiable. Newton and quasi-Newton methods are the most widely used methods for solving such problems because they have very attractive convergence properties and practical applications (see [1,2,3,4]). However, they are not usually suitable for large-scale nonlinear systems of equations because they require the Jacobian matrix, or an approximation to it, at every iteration.…”
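The Jacobian cost mentioned above is easy to see in the classical Newton iteration for Eq 1, where each step solves an n × n linear system J(xₖ)d = −F(xₖ). A minimal sketch (the test system below is a hypothetical example, not one of the article's benchmarks):

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Basic Newton method for F(x) = 0: each step solves J(x_k) d = -F(x_k).
    Forming the n x n Jacobian and factorizing it (an O(n^3) solve) at every
    iteration is what makes this expensive for large-scale systems."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = np.linalg.solve(J(x), -Fx)   # requires the full Jacobian J(x_k)
        x = x + d
    return x

# Hypothetical symmetric system with root (1, 2):
# F(x) = [x0^2 + x1 - 3, x0 + x1^2 - 5]
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
J = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])
root = newton_system(F, J, x0=[1.0, 1.0])
```

Derivative-free methods such as those in the quoted articles avoid exactly the `J(x)` evaluation and the linear solve in this loop.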
In mathematical terms, the process of solving models and finding the best alternative is known as optimization. The conjugate gradient (CG) method is an evolving computational method for solving optimization problems. In this article, an alternative modified conjugate gradient coefficient for solving large-scale nonlinear systems of equations is presented. The method is an improved version of the Rivaie et al. conjugate gradient method for unconstrained optimization problems. The new CG method is tested on a set of test functions under exact line search. The approach is easy to implement due to its derivative-free nature and has proven effective in solving real-life applications. Under some mild assumptions, the global convergence of the proposed method is established, and the new CG coefficient retains the sufficient descent condition. The performance of the new method is compared with the well-known classical PRP CG method based on number of iterations and CPU time. Numerical results on benchmark problems show that the proposed method is promising and the most efficient among all the methods tested.
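The abstract does not state the new hybrid coefficient itself, but the classical PRP scheme it is compared against follows a standard pattern: a search direction dₖ = −gₖ + βₖdₖ₋₁ with βₖ = gₖᵀ(gₖ − gₖ₋₁)/‖gₖ₋₁‖², and a line search for the step length. A sketch of that baseline, using a simple Armijo backtracking search and a hypothetical test function (not from the article):

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Polak-Ribiere-Polyak (PRP) conjugate gradient with a simple
    backtracking (Armijo) line search; a sketch of the classical scheme
    the article's hybrid coefficient builds on, not the article's method."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                        # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = 1.0                               # Armijo backtracking for alpha_k
        while alpha > 1e-12 and f(x + alpha * d) > f(x) + 1e-4 * alpha * g.dot(d):
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new.dot(g_new - g) / g.dot(g)    # PRP coefficient
        d = -g_new + max(beta, 0.0) * d           # PRP+ nonnegativity safeguard
        x, g = x_new, g_new
    return x

# Hypothetical test function: f(x) = sum((x - 1)^2), minimizer x = (1, ..., 1)
f = lambda x: np.sum((x - 1.0) ** 2)
grad = lambda x: 2.0 * (x - 1.0)
xmin = prp_cg(f, grad, x0=np.zeros(5))
```

The hybrid methods in these articles replace the `beta` line with a modified coefficient; the surrounding iteration and line-search structure stays the same.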
“…Furthermore, the search direction is generally required to satisfy the descent condition ∇f(xₖ)ᵀdₖ < 0. The derivative-free direction can be obtained in several ways [4,5,7,9,12]. An iterative method that generates a sequence {xₖ} satisfying (3) or (5) is called a norm descent method.…”
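The descent condition above guarantees that f decreases along dₖ for small enough steps, which a quick numerical check makes concrete (the quadratic below is a hypothetical example):

```python
import numpy as np

# The descent condition grad_f(x_k)^T d_k < 0 says d_k is a descent
# direction: f decreases along d_k for a small enough step.
# Hypothetical quadratic f(x) = 0.5 * x^T A x with gradient g = A x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])      # symmetric positive definite
x = np.array([1.0, -2.0])
g = A @ x                                    # gradient at x
d = -g                                       # steepest-descent direction
assert g.dot(d) < 0                          # descent condition holds
f = lambda v: 0.5 * v @ A @ v
assert f(x + 1e-3 * d) < f(x)                # a small step along d reduces f
```

Any direction making an obtuse angle with the gradient passes the same check, which is why the condition admits the several derivative-free constructions cited.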
An algorithm for solving large-scale systems of nonlinear equations is introduced, based on transforming the Newton method with line search into a derivative-free descent method. The main idea in the algorithm's construction is to approximate the Jacobian by an appropriate diagonal matrix, with the step length calculated by an inexact line search procedure. Under mild conditions, the proposed method is proved to be globally convergent. The numerical results presented show the efficiency of the proposed method.
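The diagonal-approximation idea can be sketched as follows: replace J(xₖ) in the Newton step with a diagonal matrix Dₖ, so the linear solve collapses to n scalar divisions, and choose the step length by a norm-reducing backtracking search. The specific diagonal update below (a scalar Barzilai–Borwein-like scale, Dₖ = λₖI) and the test system are assumptions for illustration, not the article's exact formulas:

```python
import numpy as np

def diag_newton(F, x0, tol=1e-8, max_iter=200):
    """Sketch of a derivative-free scheme replacing the Jacobian with a
    diagonal approximation D_k (here simply lam * I, a hypothetical choice)
    and picking the step by a norm-reduction backtracking line search."""
    x = np.asarray(x0, dtype=float)
    lam = 1.0                                   # diagonal scale: D_k = lam * I
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -Fx / lam                           # solve D_k d = -F(x_k) in O(n)
        alpha = 1.0                             # inexact search: reduce ||F||
        while alpha > 1e-12 and np.linalg.norm(F(x + alpha * d)) >= np.linalg.norm(Fx):
            alpha *= 0.5
        s = alpha * d
        y = F(x + s) - Fx
        lam = max(y.dot(s) / s.dot(s), 1e-8)    # secant-based diagonal update
        x = x + s
    return x

# Hypothetical monotone system with solution x = (1, 1):
# F(x) = [x0^3 - 1, x1 + x1^3 - 2]
F = lambda x: np.array([x[0]**3 - 1.0, x[1] + x[1]**3 - 2.0])
sol = diag_newton(F, x0=[0.5, 0.5])
```

Compared with the Newton loop, each iteration here costs only function evaluations and O(n) arithmetic, which is the point of the diagonal approximation for large-scale problems.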