2020
DOI: 10.37418/amsj.9.7.61

A New Modification of NPRP Conjugate Gradient Method for Unconstrained Optimization

Cited by 9 publications (11 citation statements)
References 0 publications
“…We begin this section by showing that the new parameter satisfies the sufficient descent property. From [7], it follows that KMAR parameter will reduce to the following:…”
Section: Convergence Results (mentioning)
confidence: 99%
“…where f is a smooth function f : R^n → R whose gradient g(x) = ∇f(x) is always available [1]. Problems of the form (1) arise in various specialized disciplines such as computer science, machine learning, neural networks, engineering, statistics and many more (see [2][3][4][5][6][7]). For simplicity, ∇f(x_k) and f(x_k) are abbreviated ∇_k and f_k throughout this study, and ‖·‖ denotes the Euclidean norm of vectors.…”
Section: Introduction (mentioning)
confidence: 99%
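The excerpt above describes the general unconstrained problem min f(x) whose gradient g(x) = ∇f(x) is always available. As a rough illustration only (using the classical PRP parameter with a backtracking Armijo line search, not the NPRP modification this paper proposes; all names are illustrative), one CG iteration can be sketched as:

```python
import numpy as np

def prp_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the classical PRP+ beta and a backtracking line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking (Armijo) line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # classical PRP beta: g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new.dot(g_new - g) / g.dot(g)
        beta = max(beta, 0.0)  # PRP+ restart safeguard
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# minimize a simple quadratic whose minimizer is (1, 2)
x_star = prp_cg(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2,
                lambda x: np.array([2 * (x[0] - 1), 2 * (x[1] - 2)]),
                np.zeros(2))
```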
“…Property (*): Consider a CG method generated by Algorithm 1, and assume that for all k ≥ 0, 0 < γ ≤ ‖g_k‖ ≤ γ̄, (27) for some positive constants γ and γ̄. Then the method is said to possess Property (*) if…”
Section: Proof (mentioning)
confidence: 99%
“…However, the methods presented in (7) are affected by the jamming phenomenon, which degrades their computational performance [16,21]. The convergence issues of the first set of CG methods defined in (6) and the poor numerical results of the second set (7) have led to numerous studies of the CG formulas [16,[22][23][24][25][26][27][28][29][30][31][32].…”
Section: Introduction (mentioning)
confidence: 99%
“…The CG algorithms are unconstrained optimization procedures characterized by their simplicity, good convergence properties, low memory requirements, and low computational cost per iteration (Malik et al., 2020). For more details on hybrid conjugate gradient methods, refer to (Touati-Ahmed and Storey, 1990; Gilbert and Nocedal, 1992; Dai and Yuan, 2001; Andrei, 2008; Liu and Li, 2014; Yakubu et al., 2020; Malik et al., 2021; Sulaiman and Mamat, 2020a; Sulaiman et al., 2022a). This paper defines a new hybrid CG algorithm using an inexact line search procedure to solve intuitionistic fuzzy nonlinear problems.…”
Section: Introduction (mentioning)
confidence: 99%
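The excerpt above points to hybrid CG methods such as Touati-Ahmed and Storey (1990). That classical hybrid rule, which switches between the PRP and FR parameters, can be sketched as follows (a hedged illustration of the cited reference's idea, not this paper's NPRP formula):

```python
import numpy as np

def hybrid_beta(g_new, g_old):
    """Hybrid CG parameter in the style of Touati-Ahmed and Storey (1990):
    use the PRP value when it lies in [0, beta_FR], else fall back to FR."""
    denom = g_old.dot(g_old)                       # ||g_k||^2
    beta_fr = g_new.dot(g_new) / denom             # Fletcher-Reeves
    beta_prp = g_new.dot(g_new - g_old) / denom    # Polak-Ribiere-Polyak
    return beta_prp if 0.0 <= beta_prp <= beta_fr else beta_fr

# example: when beta_PRP is negative, the rule falls back to beta_FR
b = hybrid_beta(np.array([0.5, 0.0]), np.array([1.0, 0.0]))  # b == 0.25
```

Falling back to the FR value whenever the PRP value leaves [0, beta_FR] retains PRP's restart-like behavior on non-descent steps while inheriting FR's global convergence safeguards.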