1999
DOI: 10.1007/s002119900080

Global convergence of the method of shortest residuals

Abstract: The method of shortest residuals (SR) was presented by Hestenes and studied by Pytlak. If the function is quadratic, and if the line search is exact, then the SR method reduces to the linear conjugate gradient method. In this paper, we put forward the formulation of the SR method when the line search is inexact. We prove that, if stepsizes satisfy the strong Wolfe conditions, both the Fletcher-Reeves and Polak-Ribière-Polyak versions of the SR method converge globally. When the Wolfe conditions are used, the t…
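The abstract requires stepsizes to satisfy the strong Wolfe conditions. As a reminder of what that entails, here is a minimal sketch of a checker for those conditions; the function names and the constants c1, c2 are illustrative choices, not taken from the paper:

```python
import numpy as np

def satisfies_strong_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    """Check the strong Wolfe conditions for step size alpha along
    a descent direction d (c1 < c2 are typical choices for CG methods)."""
    phi0 = f(x)
    dphi0 = grad(x) @ d                  # directional derivative at alpha = 0
    phi_a = f(x + alpha * d)
    dphi_a = grad(x + alpha * d) @ d
    sufficient_decrease = phi_a <= phi0 + c1 * alpha * dphi0
    curvature = abs(dphi_a) <= c2 * abs(dphi0)
    return sufficient_decrease and curvature

# On the quadratic f(x) = 0.5 x^T x, the exact step from x = (1, 1)
# along d = -grad f(x) is alpha = 1, which satisfies both conditions.
f = lambda x: 0.5 * x @ x
grad = lambda x: x
x = np.array([1.0, 1.0])
d = -grad(x)
print(satisfies_strong_wolfe(f, grad, x, d, alpha=1.0))  # True
```

A step that is too short (e.g. alpha = 0.01 here) passes the sufficient-decrease test but fails the curvature test, which is exactly what the second condition is designed to rule out.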

Cited by 13 publications (16 citation statements)
References 15 publications
“…It is claimed in [1] that the procedure stated in [2], which finds α k satisfying (3)-(4) in a finite number of operations, is not correct.…”
Section: The Counter-example
confidence: 97%
“…In our numerical comparisons we also considered a line search rule based on the Wolfe conditions, accommodating the box constraints; that directional minimization rule is an extension of the rule stated in [12]. LSR2: find a positive number α_k such that…”
Section: Lemma 1 (Let x Be a Feasible Point; Then x Is a Critical Point)
confidence: 99%
“…bears a striking resemblance to the Polak-Ribière formula and has not only superior numerical properties but also convergence properties better than those of all existing versions of the Polak-Ribière algorithm (see, e.g., [2], [12], [13], [16], [18] and [29]). In [28] the preconditioned version of the algorithm is introduced.…”
Section: Introduction
confidence: 99%
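The citation above compares the cited formula to the classical Polak-Ribière coefficient. For reference, that coefficient is β_k = g_{k+1}ᵀ(g_{k+1} − g_k) / ‖g_k‖²; a minimal sketch (the function name is illustrative):

```python
import numpy as np

def beta_pr(g_new, g_old):
    """Classical Polak-Ribière conjugate gradient coefficient:
    beta = g_new^T (g_new - g_old) / ||g_old||^2."""
    return g_new @ (g_new - g_old) / (g_old @ g_old)

g_old = np.array([3.0, 4.0])    # ||g_old||^2 = 25
g_new = np.array([1.0, 2.0])
print(beta_pr(g_new, g_old))    # (1*(-2) + 2*(-2)) / 25 = -0.24
```

Note that β can be negative, as here; several globally convergent PR variants restart with the steepest-descent direction (effectively taking max(β, 0)) precisely for this reason.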
“…In [9] (see also [4]) a new family of conjugate gradient algorithms has been introduced whose direction-finding subproblem is expressed through the operator Nr{a, b}, defined as the point of the line segment spanned by the vectors a and b that has the smallest norm, i.e., Nr{a, b} = argmin { ‖λa + (1 − λ)b‖ : λ ∈ [0, 1] }.…”
Section: Introduction
confidence: 99%
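The minimum-norm point of a line segment, as used in the Nr{a, b} operator quoted above, has a simple closed form: minimize ‖b + λ(a − b)‖² over λ and clamp the unconstrained minimizer to [0, 1]. A sketch under that definition (function name is illustrative):

```python
import numpy as np

def nr(a, b):
    """Minimum-norm point of the segment {lam*a + (1-lam)*b : lam in [0, 1]}.
    Minimizes ||b + lam*(a - b)||^2 over lam, clamping lam to [0, 1]."""
    d = a - b
    denom = d @ d
    if denom == 0.0:            # a == b: the segment is a single point
        return a.copy()
    lam = np.clip(-(b @ d) / denom, 0.0, 1.0)
    return b + lam * d

a = np.array([2.0, 0.0])
b = np.array([0.0, 2.0])
print(nr(a, b))                 # midpoint [1. 1.] has the smallest norm
```

When the origin's projection onto the line falls outside the segment, the clamp returns the nearer endpoint, which is why the SR direction reduces to steepest descent in that case.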