In this paper, we propose a new method for the numerical solution of fuzzy nonlinear equations in parametric form using a new conjugate gradient technique. A table of numerical solutions is given to show the efficiency of the proposed method, which is compared with classical algorithms such as the Fletcher-Reeves (FR), Polak-Ribiere (PRP), and Fletcher conjugate descent (CD) techniques.
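As a rough illustration of the classical techniques the abstract names, the following sketch (an assumed illustration, not the paper's actual algorithm; `cg_solve` and its parameters are hypothetical) solves a crisp scalar nonlinear equation F(x) = 0 by minimizing f(x) = F(x)^2 / 2 with a nonlinear conjugate gradient iteration, where `beta_rule` selects among the standard FR, PRP, and CD update formulas:

```python
# Minimal sketch: nonlinear CG for a scalar equation F(x) = 0,
# recast as minimization of the merit function f(x) = F(x)^2 / 2.

def cg_solve(F, dF, x0, beta_rule="FR", tol=1e-8, max_iter=500):
    f = lambda x: 0.5 * F(x) ** 2      # least-squares merit function
    grad = lambda x: F(x) * dF(x)      # its derivative
    x = x0
    g = grad(x)
    d = -g                             # start with steepest descent
    for _ in range(max_iter):
        # backtracking line search (Armijo sufficient-decrease condition)
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * g * d:
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x = x + alpha * d
        g_new = grad(x)
        if abs(F(x)) < tol:
            return x
        if beta_rule == "FR":          # Fletcher-Reeves
            beta = g_new ** 2 / g ** 2
        elif beta_rule == "PRP":       # Polak-Ribiere
            beta = g_new * (g_new - g) / g ** 2
        else:                          # CD: Fletcher's conjugate descent
            beta = g_new ** 2 / (-d * g)
        d = -g_new + beta * d
        if g_new * d >= 0:             # safeguard: restart on non-descent
            d = -g_new
        g = g_new
    return x
```

For instance, `cg_solve(lambda x: x**3 - 2, lambda x: 3 * x**2, 1.0)` approximates the real cube root of 2. The fuzzy setting of the paper would instead apply such an iteration to the pair of parametric (r-cut) equations; that extension is not shown here.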
Conjugate gradient methods are excellent neural network training methods because of their simplicity, numerical efficiency, and very low memory requirements. It is well-known that training a neural network is closely tied to unconstrained optimization theory, and many attempts have been made to speed up this process. In particular, various algorithms motivated by numerical optimization theory have been applied to accelerate neural network training. In this paper, we propose a conjugate gradient neural network training algorithm based on Aitken's process that guarantees sufficient descent with the Wolfe line search. Moreover, we establish that the proposed method is globally convergent for general functions under the strong Wolfe conditions. In the experimental results, we compare the behavior of our proposed method (NACG) with well-known methods in this field.
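Aitken's process mentioned above is the classical delta-squared extrapolation, which maps a linearly convergent sequence x_n to the faster sequence x_n - (x_{n+1} - x_n)^2 / (x_{n+2} - 2 x_{n+1} + x_n). The sketch below (a generic illustration of the acceleration device, not the paper's NACG algorithm; the helper `aitken` is hypothetical) demonstrates it on a fixed-point iteration:

```python
import math

def aitken(seq):
    """Apply Aitken's delta-squared acceleration to a scalar sequence."""
    out = []
    for x0, x1, x2 in zip(seq, seq[1:], seq[2:]):
        denom = x2 - 2 * x1 + x0
        # fall back to the raw term if the denominator vanishes
        out.append(x2 if denom == 0 else x0 - (x1 - x0) ** 2 / denom)
    return out

# Example: the linearly convergent fixed-point iteration x <- cos(x),
# whose limit is the Dottie number 0.739085...
xs = [1.0]
for _ in range(6):
    xs.append(math.cos(xs[-1]))
accelerated = aitken(xs)
```

After only six cos-iterations, the last accelerated term is already within about 10^-3 of the fixed point, whereas the raw sequence is still oscillating at the 10^-2 level; this gain is what motivates embedding the process in a training algorithm.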