2021 IEEE Engineering International Research Conference (EIRCON)
DOI: 10.1109/eircon52903.2021.9613264
An Overview on Conjugate Gradient Methods for Optimization, Extensions and Applications

Cited by 2 publications (2 citation statements); references 28 publications.
“…, implying that Q_k can be expressed as a convex combination of f(x_k) and Q_{k-1}. Consequently, this signifies that η_{k-1} = 0, since P_{k-1} and Q_{k-1} are both non-zero. Therefore, the non-monotone line search turns monotone, that is, equation (8)…”
Section: H1 The Set of Points (mentioning, confidence: 99%)
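The citing passage argues that when the mixing parameter η_{k-1} is zero, the non-monotone line search reduces to the ordinary monotone one. A minimal sketch of the standard averaged-cost recursion behind this kind of non-monotone line search (the Zhang-Hager-style update Q_k = η_{k-1}Q_{k-1} + 1, C_k = (η_{k-1}Q_{k-1}C_{k-1} + f(x_k))/Q_k; the function name and numeric values below are our own illustration, not taken from the paper):

```python
# Illustrative sketch of the reference-value recursion used in
# Zhang-Hager-style nonmonotone line searches:
#   Q_k = eta_{k-1} * Q_{k-1} + 1
#   C_k = (eta_{k-1} * Q_{k-1} * C_{k-1} + f(x_k)) / Q_k
# C_k is a convex combination of C_{k-1} and f(x_k); with eta = 0 it
# collapses to f(x_k), i.e. the line search becomes monotone.

def update_reference(f_k, C_prev, Q_prev, eta):
    """One step of the nonmonotone reference-value recursion."""
    Q = eta * Q_prev + 1.0
    C = (eta * Q_prev * C_prev + f_k) / Q
    return C, Q

# eta = 0: the reference equals the current function value (monotone case)
C, Q = update_reference(f_k=3.5, C_prev=10.0, Q_prev=4.0, eta=0.0)
assert (C, Q) == (3.5, 1.0)

# eta = 1: the reference is the running average of all f-values seen so far
C, Q = update_reference(f_k=3.5, C_prev=10.0, Q_prev=4.0, eta=1.0)
# Q = 5, C = (4 * 10 + 3.5) / 5 = 8.7
```

Setting η between 0 and 1 interpolates between strictly monotone acceptance (η = 0) and accepting any step that improves on the running average of all past function values (η = 1).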
“…In this context, the gradient and Hessian of the function (1) at the point x are represented by g(x) = ∇f(x) and B = ∇²f(x), respectively. CG is among the most frequently used iterative algorithms for solving the optimization problem (1) [8]. This method updates its sequence of iterates using the formula below,…”
Section: Introduction (mentioning, confidence: 99%)
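The CG iteration referenced in this quotation follows the standard update x_{k+1} = x_k + α_k d_k with direction d_{k+1} = -g_{k+1} + β_k d_k. A minimal sketch of that scheme is below; the quadratic test problem, the exact line search, and the Fletcher-Reeves choice of β_k are illustrative choices of ours, not the specific variant surveyed in the paper:

```python
import numpy as np

# Minimal nonlinear CG sketch (illustrative choices, not the paper's method).
# Iteration: x_{k+1} = x_k + alpha_k d_k,  d_{k+1} = -g_{k+1} + beta_k d_k,
# applied here to the convex quadratic f(x) = 0.5 * x^T A x, A = diag(1, 10).

A_DIAG = np.array([1.0, 10.0])

def grad(x):
    # Gradient of f(x) = 0.5 * x^T A x is A x
    return A_DIAG * x

def cg_minimize(x0, n_iter=50):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # first direction: steepest descent
    for _ in range(n_iter):
        if g @ g < 1e-16:                   # gradient numerically zero: done
            break
        Ad = A_DIAG * d
        alpha = -(g @ d) / (d @ Ad)         # exact line search for a quadratic
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        g = g_new
    return x

x_star = cg_minimize([3.0, -2.0])
# On a 2-D quadratic with exact line search, CG reaches the minimizer
# (here the origin) in at most 2 iterations, up to roundoff.
```

For a quadratic in n dimensions with exact line searches, CG terminates in at most n steps; the many β_k formulas (Fletcher-Reeves, Polak-Ribière, Hestenes-Stiefel, etc.) coincide in that case and differ only for general nonlinear f.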