2023
DOI: 10.1016/j.exmath.2023.04.004
Introducing memory to a family of multi-step multidimensional iterative methods with weight function

Cited by 6 publications (13 citation statements)
References 11 publications
“…To avoid the generally expensive computation of the Fréchet derivative P′(x_k) of the operator P, and to increase the order of convergence above one, other ITs have been developed [1,3–7] using divided differences of order one [1,2]. The method of chords, Regula falsi, and the Secant method are among the most used ITs for solving Equation (1).…”
Section: Introduction (mentioning, confidence: 99%)
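As a concrete illustration of the idea in this statement, the sketch below implements a standard componentwise first-order divided difference operator [x, y; P] for a system P: R^n → R^n and uses it in a secant-type (chord-like) iteration. The operator definition, function names, and test system are illustrative assumptions, not taken from the cited papers.

```python
# A minimal sketch, assuming a standard componentwise definition of the
# first-order divided difference operator [x, y; P]; names and the test
# system are illustrative, not taken from the cited papers.
import numpy as np

def divided_difference(P, x, y):
    """Componentwise first-order divided difference [x, y; P] of P: R^n -> R^n.

    Column j: (P(x_1..x_j, y_{j+1}..y_n) - P(x_1..x_{j-1}, y_j..y_n)) / (x_j - y_j)
    """
    n = x.size
    D = np.empty((n, n))
    for j in range(n):
        u = np.concatenate((x[:j + 1], y[j + 1:]))  # x up to component j, then y
        v = np.concatenate((x[:j], y[j:]))          # x up to component j-1, then y
        D[:, j] = (P(u) - P(v)) / (x[j] - y[j])
    return D

def secant_system(P, x0, x1, tol=1e-10, max_iter=50):
    """Secant-type iteration x_{k+1} = x_k - [x_{k-1}, x_k; P]^{-1} P(x_k)."""
    for _ in range(max_iter):
        D = divided_difference(P, x0, x1)
        x2 = x1 - np.linalg.solve(D, P(x1))
        if np.linalg.norm(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Illustrative test system: x1^2 + x2^2 = 4, x1*x2 = 1.
P = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] * x[1] - 1.0])
root = secant_system(P, np.array([2.0, 0.0]), np.array([1.9, 0.4]))
print(root, P(root))
```

The trade-off the quoted statement refers to is visible here: no derivative of P is ever formed, at the cost of the secant method's superlinear (rather than quadratic) convergence in the classical analysis.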
“…which is also of convergence order two [4,5]. But the divided difference [·, ·; P] is used instead of the derivative P′.…”
Section: Introduction (mentioning, confidence: 99%)
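The second-order, derivative-free variant referred to here is typically a Steffensen-type step in which P′(x_k) is replaced by the divided difference [x_k + P(x_k), x_k; P]. A minimal sketch follows, reusing the divided_difference helper and the illustrative system from the previous block; the auxiliary point w_k = x_k + P(x_k) is the usual textbook choice, assumed here rather than taken from the cited paper.

```python
def steffensen_system(P, x0, tol=1e-10, max_iter=50):
    """Steffensen-type iteration x_{k+1} = x_k - [w_k, x_k; P]^{-1} P(x_k),
    with w_k = x_k + P(x_k); derivative-free and of order two under the usual
    smoothness and nonsingularity assumptions. Reuses divided_difference above."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Px = P(x)
        if np.linalg.norm(Px) < tol:
            return x
        w = x + Px                        # auxiliary point replacing P'(x_k)
        D = divided_difference(P, w, x)   # [w_k, x_k; P]
        x_new = x - np.linalg.solve(D, Px)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(steffensen_system(P, np.array([1.9, 0.4])))  # same illustrative system as above
```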
“…Cordero et al. constructed a derivative-free multi-step iterative scheme based on Steffensen's method. In fact, to avoid excessively increasing the number of functional evaluations and, at the same time, to increase the order of convergence, divided differences and a weight function on already evaluated functions were used [23].…”
Section: Introduction (mentioning, confidence: 99%)
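This statement describes the construction pattern rather than a specific formula. The sketch below shows one generic way a weight function can be layered on a Steffensen-type predictor while reusing already-evaluated divided differences; the weight H(t) = 2I − t and the matrix argument t_k are illustrative placeholders, not the scheme or weight function of Cordero et al. [23], and no convergence order is claimed for this exact combination. It reuses divided_difference and the test system from the first sketch.

```python
def two_step_weighted(P, x0, H=None, tol=1e-10, max_iter=50):
    """Generic two-step, derivative-free pattern with a matrix weight function
    (an illustrative sketch, not the exact scheme of [23]):
        w_k     = x_k + P(x_k)
        y_k     = x_k - [w_k, x_k; P]^{-1} P(x_k)      (Steffensen-type predictor)
        t_k     = [w_k, x_k; P]^{-1} [y_k, x_k; P]     (built from evaluated data)
        x_{k+1} = y_k - H(t_k) [w_k, x_k; P]^{-1} P(y_k)
    """
    if H is None:
        H = lambda t: 2.0 * np.eye(t.shape[0]) - t     # placeholder weight function
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Px = P(x)
        if np.linalg.norm(Px) < tol:
            return x
        w = x + Px
        D = divided_difference(P, w, x)                # only one linear-system matrix
        y = x - np.linalg.solve(D, Px)
        t = np.linalg.solve(D, divided_difference(P, y, x))
        x_new = y - H(t) @ np.linalg.solve(D, P(y))
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

print(two_step_weighted(P, np.array([1.9, 0.4])))  # same illustrative system
```

The design point the quote highlights is that the corrector step reuses P-values and divided differences that were already computed for the predictor, so the extra order-raising machinery adds little evaluation cost.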
“…These new schemes can be with or without memory, the former increasing the convergence order without adding computational cost. Some references in this area are [10–13] and the references therein.…”
Section: Introduction (mentioning, confidence: 99%)
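For the with-memory case mentioned here, a classical scalar illustration is Steffensen's method with a self-accelerating parameter: the free parameter γ_k in w_k = x_k + γ_k f(x_k) is updated from already-stored iterates, which is known to raise the order above two with no additional function evaluations per step. This is a textbook example of introducing memory, not the specific family constructed in the cited paper.

```python
def steffensen_with_memory(f, x0, gamma0=0.01, tol=1e-12, max_iter=50):
    """Scalar Steffensen-type method with memory (classical illustration, not the
    family of the cited paper):
        w_k         = x_k + gamma_k * f(x_k)
        x_{k+1}     = x_k - f(x_k) / f[x_k, w_k]
        gamma_{k+1} = -1 / f[x_k, x_{k+1}]     (memory: reuses stored values)
    Two function evaluations per step, the same as the memoryless version."""
    x, gamma = float(x0), float(gamma0)
    fx = f(x)
    for _ in range(max_iter):
        if abs(fx) < tol:
            return x
        w = x + gamma * fx
        dd = (f(w) - fx) / (w - x)              # divided difference f[x_k, w_k]
        x_new = x - fx / dd
        fx_new = f(x_new)                       # reused as f(x_{k+1}) next step
        gamma = -(x_new - x) / (fx_new - fx)    # = -1 / f[x_k, x_{k+1}]
        x, fx = x_new, fx_new
    return x

# Illustrative example: cube root of 2.
print(steffensen_with_memory(lambda x: x ** 3 - 2.0, 1.0))
```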