2023
DOI: 10.1051/ro/2022213

A survey on the Dai–Liao family of nonlinear conjugate gradient methods

Abstract: At the beginning of this century, which is characterized by huge flows of emerging data, Dai and Liao proposed a pervasive conjugacy condition that triggered the interest of many optimization scholars. Recognized as a sophisticated conjugate gradient (CG) algorithm after about two decades, here we share our visions and thoughts on the method in the framework of a review study. In this regard, we first discuss the modified Dai–Liao methods based on the modified secant equations given in the literature, mostly w…

Cited by 16 publications (8 citation statements)
References 121 publications
“…Most of these methods were developed by modifying the conjugate gradient parameter β_k^DL [2][3][4][5][6][7][8][9]. For more details, see the survey on the DL family of nonlinear CG methods in [10]. One of the rules for defining β_k is denoted β_k^MHSDL and defined in [7] by…”
Section: Introduction and Background Results (mentioning, confidence: 99%)
“…Alternatively, for ν_k(∆_k) = 1, equation (2) is considered a conjugacy condition that implicitly satisfies the quasi-Newton characteristics. For more details on these cases, see [1, 10].…”
Section: Fuzzy Neutrosophic Dai–Liao Conjugate Gradient Methods (mentioning, confidence: 99%)
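For reference, the conjugacy condition at the heart of these excerpts is the classical Dai–Liao condition from the surveyed family (stated here from general knowledge of the DL literature, not from the truncated excerpts themselves). With s_{k-1} = x_k − x_{k-1} and y_{k-1} = g_k − g_{k-1}, it reads:

```latex
% Dai--Liao conjugacy condition with nonnegative parameter t:
d_k^\top y_{k-1} = -t\, g_k^\top s_{k-1}, \qquad t \ge 0,
% which reduces to the classical Hestenes--Stiefel conjugacy
% condition d_k^\top y_{k-1} = 0 when t = 0.
```

The choice t = 0 recovers exact conjugacy, while t > 0 injects quasi-Newton-like information through the step s_{k-1}, which is the property the "ν_k(∆_k) = 1" case above generalizes.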
“…Before exploring VOPs, let us consider some well-known CG parameters related to the natural unconstrained optimization problem, which focuses on minimizing an objective function. The parameters include the β_k of Polak–Ribière–Polyak (PRP) [4], Hestenes–Stiefel (HS) [5], Dai–Liao (DL) [6], and Hager–Zhang (HZ) [7, 8]. Other well-known CG methods include: a survey on DL [9], Fletcher–Reeves (FR) [10], Conjugate Descent (CD) [11], Dai–Yuan (DY) [12], and Liu–Storey (LS) [13]. In most cases, the convergence of the CG method based on these parameters is achieved only if the search direction attains a descent property or sufficient descent condition.…”
Section: Introduction (mentioning, confidence: 99%)
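To make the excerpt above concrete, here is a minimal sketch of a nonlinear CG iteration using the Dai–Liao parameter β_k^DL = g_k^T(y_{k-1} − t·s_{k-1}) / (d_{k-1}^T y_{k-1}). The function name `dai_liao_cg`, the Armijo backtracking line search, and the descent-restart safeguard are illustrative assumptions, not the survey's exact algorithm:

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Sketch of nonlinear CG with the Dai-Liao parameter (illustrative, not
    the survey's exact algorithm). Uses Armijo backtracking line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: halve alpha until sufficient decrease holds
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= 0.5
        s = alpha * d                 # step s_{k-1}
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                 # gradient difference y_{k-1}
        denom = d @ y
        # Dai-Liao parameter: beta = g_k^T (y_{k-1} - t * s_{k-1}) / (d_{k-1}^T y_{k-1})
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        # restart with steepest descent if d is not a descent direction
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a convex quadratic f(x) = ½xᵀAx − bᵀx the iteration converges to A⁻¹b, which makes the descent-condition remark in the excerpt easy to observe: whenever β_k would destroy descent, the restart falls back to −g_k.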
“…These methods are mainly divided into two categories: traditional optimization techniques and meta-heuristic algorithms. Traditional optimization methods rely more on the known information of the problem to solve deterministic problems effectively, such as the branch-and-bound algorithm [6], the conjugate gradient method [7], the steepest descent method [8], etc. Unlike these techniques, meta-heuristic algorithms obtain new optimization models by simulating certain natural phenomena or animal behaviors.…”
Section: Introduction (mentioning, confidence: 99%)