2009
DOI: 10.1016/j.cam.2008.07.013

Improved Newton’s method without direct function evaluations

Abstract: For solving systems of nonlinear equations, we have recently developed a Newton's method able to cope with inaccurate function values or with problems of high computational cost. In this work we introduce a modification of that method which reduces the total computational cost and, in general, improves its overall performance. Moreover, the proposed version retains the quadratic convergence, the good behavior in the case of singular or ill-conditioned Jacobian matrices, and its capability to be…
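For orientation, the sketch below shows a plain Newton iteration for a nonlinear system F(x) = 0. It is not the authors' modified scheme (which avoids direct function evaluations); the least-squares solve is only an assumed, commonly used safeguard for singular or ill-conditioned Jacobians.

```python
import numpy as np

def newton_system(F, J, x0, tol=1e-10, max_iter=50):
    """Generic Newton iteration for a nonlinear system F(x) = 0.

    F : callable returning the residual vector F(x)
    J : callable returning the Jacobian matrix J(x)
    This is a plain reference sketch, not the paper's modified method.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r = F(x)
        if np.linalg.norm(r) < tol:
            break
        # Solve J(x) s = -F(x); lstsq is an assumed safeguard so the
        # step is still defined when J(x) is singular or ill-conditioned.
        s, *_ = np.linalg.lstsq(J(x), -r, rcond=None)
        x = x + s
    return x

# Usage: a small 2x2 system with a solution on the unit circle.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
print(newton_system(F, J, x0=[1.0, 0.0]))
```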

Cited by 4 publications (5 citation statements)
References 22 publications
“…which has also been used in [16] and φ_i(y_k) as given in the previous subsection. By using Taylor's formula, the linear approximations of g_i(x), for i ∈ I_r, around the current point x_k, lead to the following linear equations:…”
Section: Optimization
mentioning, confidence: 99%
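For reference (not quoted from the citing paper), the first-order Taylor expansion being invoked has the standard form

$$ g_i(x) \approx g_i(x_k) + \nabla g_i(x_k)^{\top} (x - x_k), \qquad i \in I_r, $$

and setting each such linearization to zero gives one linear equation per component.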
“…At this point, we describe a way of extracting such points, which are named pivot points. More details on them can be found in [9,11,14,16]. …”
Section: Algorithm 1: The Framework of New Direction via an Approximat…
mentioning, confidence: 99%
“…In order to obtain a diminishing direction, each component of q(x_k) must take the minimum value, and the approximation q_i(x_k) of the gradient component g_i(x_k) is constructed by selecting an auxiliary point from the zero contour of g_i(x), which is named a pivot point [31], [32]. A gradient component g_i(x_k) corresponds to the pivot point x_i^k of g_i(x).…”
Section: B. CAG (Componentwise Approximated Gradient) Direction
mentioning, confidence: 99%
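As a hedged illustration of how a pivot point can replace a direct evaluation (an assumed first-order form, not necessarily the exact construction used in [31], [32]): if the pivot point x_i^k lies on the zero contour of g_i, so that g_i(x_i^k) = 0, then expanding g_i around x_k gives

$$ 0 = g_i(x_i^k) \approx g_i(x_k) + \nabla g_i(x_k)^{\top}\,(x_i^k - x_k) \;\Longrightarrow\; g_i(x_k) \approx \nabla g_i(x_k)^{\top}\,(x_k - x_i^k). $$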
“…If the number of cases when s_k is defined using (15) is finite, that is, s_k is defined by (14) for all sufficiently large k, then the proof follows from Theorem 2.1 in view of (12). Assume that there is a subsequence k_m → ∞ such that…”
Section: Global Convergence
mentioning, confidence: 99%
“…This method is one of the most popular methods due to its attractive quadratic convergence, but it depends on the initial point, and the computation of the inverse of the Hessian can sometimes be time consuming [18]. A number of different modified Newton methods have been introduced to improve the performance of the Newton method [1, 8–10, 12]. However, the global convergence of these methods is not always guaranteed.…”
Section: Introduction
mentioning, confidence: 99%
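To make the quoted cost issue concrete, here is a minimal, generic Newton step for unconstrained minimization. It is not any of the cited modified methods; it simply solves a linear system instead of explicitly inverting the Hessian, which is the expensive operation mentioned above.

```python
import numpy as np

def newton_step(grad, hess, x):
    """One generic Newton step for unconstrained minimization.

    grad : callable returning the gradient vector at x
    hess : callable returning the Hessian matrix at x
    Solves H d = -g rather than forming the Hessian inverse explicitly.
    This is a textbook sketch, not one of the cited modified methods.
    """
    g = grad(x)
    H = hess(x)
    d = np.linalg.solve(H, -g)  # Newton direction
    return x + d

# Usage: one step on the quadratic f(x) = x1^2 + 2*x2^2 from (1, 1).
grad = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])
hess = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
print(newton_step(grad, hess, np.array([1.0, 1.0])))  # -> [0. 0.]
```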