2020
DOI: 10.46300/9102.2020.14.20

New Two-step Conjugate Gradient Method for Unconstrained Optimization

Abstract: Two-step methods are secant-like techniques of the quasi-Newton type that, unlike the classical methods, construct nonlinear alternatives to the quantities used in the so-called Secant equation. Two-step methods instead incorporate data available from the two most recent iterations, creating an alternative to the Secant equation that is intended to produce better Hessian approximations and thereby induce faster convergence to the minimizer of the objective function. Such methods, based on reported numerical …
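
For orientation, here is a minimal sketch of the idea the abstract describes, in standard quasi-Newton notation rather than the paper's own. The classical Secant equation asks the new Hessian approximation B_{k+1} to satisfy

    B_{k+1} s_k = y_k,    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k,

with g_k the gradient of the objective at x_k. Two-step methods replace s_k and y_k by quantities built from the two most recent steps; one common form in the literature is

    B_{k+1} r_k = w_k,    with r_k = s_k - δ_k s_{k-1} and w_k = y_k - δ_k y_{k-1},

where δ_k is a scalar weight determined by the particular method. The specific alternative relation and the choice of weight proposed in this paper are not reproduced here; the display above only illustrates the general form.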

Cited by 1 publication (3 citation statements) · References: 24 publications
“…They proved to be highly effective in practice and showed well-established convergence properties. This idea was presented as a technique in several manuscripts, for example to maximize the advantages of the respective original conjugate gradient methods [7], [14]-[16]. To introduce a new method, we will find the Hessian approximation at the minimum of a function f(u), which gives a new search direction, and choose the conjugate coefficient so that it satisfies the above relation.…”
Section: −Q_{k+1}
mentioning
confidence: 99%
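
To make the quoted passage concrete: in a nonlinear conjugate gradient method, the new search direction combines the current negative gradient with the previous direction through a conjugacy coefficient β_k. The Python sketch below uses the classical Fletcher-Reeves coefficient purely as a placeholder; the two-step, Hessian-approximation-based coefficient derived in the cited paper is not reproduced here, and the function names are illustrative only.

import numpy as np

def fletcher_reeves_beta(g_new, g_old):
    # Classical Fletcher-Reeves conjugacy coefficient, used only as a stand-in;
    # the cited paper derives its own beta from a two-step Hessian approximation.
    return float(g_new @ g_new) / float(g_old @ g_old)

def cg_direction(g_new, g_old, d_old):
    # Generic nonlinear CG update: d_{k+1} = -g_{k+1} + beta_k * d_k
    beta = fletcher_reeves_beta(g_new, g_old)
    return -g_new + beta * d_old

In a complete solver this update is paired with a line search (for example the Wolfe conditions quoted in the next statement) and an occasional restart with the steepest-descent direction d = -g.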
“…For general nonlinear functions, it is necessary to use an iterative procedure [6]. We use the standard Wolfe criteria to determine the step length throughout our procedure, as shown in (6) and (7):…”
mentioning
confidence: 99%
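
The Wolfe criteria referred to above are the standard sufficient-decrease and curvature conditions on the step length. Equations (6) and (7) of the citing paper are not reproduced here; the Python sketch below only checks the textbook form of the two conditions for user-supplied f and grad callables, with the usual constants c1 and c2 as assumed defaults.

import numpy as np

def satisfies_wolfe(f, grad, x, d, alpha, c1=1e-4, c2=0.1):
    # Sufficient decrease (Armijo): f(x + a*d) <= f(x) + c1 * a * g^T d
    # Curvature condition:          grad(x + a*d)^T d >= c2 * g^T d
    gtd = float(grad(x) @ d)
    sufficient_decrease = f(x + alpha * d) <= f(x) + c1 * alpha * gtd
    curvature = float(grad(x + alpha * d) @ d) >= c2 * gtd
    return sufficient_decrease and curvature

# Example on the quadratic f(x) = 0.5 * ||x||^2, stepping along steepest descent:
f = lambda x: 0.5 * float(x @ x)
grad = lambda x: np.asarray(x, dtype=float)
x0 = np.array([1.0, -2.0])
d0 = -grad(x0)
print(satisfies_wolfe(f, grad, x0, d0, alpha=1.0))   # True: the full step reaches the minimizer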