2022
DOI: 10.11591/ijeecs.v28.i1.pp551-558
A new three-term conjugate gradient method for training neural networks with global convergence

Abstract: Conjugate gradient (CG) methods are excellent neural network training methods, offering simplicity, flexibility, numerical efficiency, and low memory requirements. In this paper, we introduce a new three-term conjugate gradient method for solving optimization problems and test it on artificial neural networks (ANN) for training a feed-forward network. The new method satisfies the descent condition and the sufficient descent condition. Global convergence of the new (NTTCG) method has been t…
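The abstract does not give the NTTCG coefficient formulas, but the general shape of a three-term CG iteration can be sketched. The coefficients below (a Hestenes–Stiefel-style beta and an illustrative third-term weight theta), the fixed Armijo line-search constants, and the restart safeguard are all assumptions for illustration, not the paper's method:

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-8, max_iter=5000):
    """Generic three-term CG sketch: p_{k+1} = -g_{k+1} + beta*p_k - theta*y_k.

    The beta/theta formulas here are illustrative, NOT the paper's NTTCG
    coefficients, which the abstract does not state.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    p = -g                           # start from the steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ p >= 0:               # safeguard: restart if p is not a descent direction
            p = -g
        # Backtracking Armijo line search for the step size phi_i
        alpha, fx, gp = 1.0, f(x), g @ p
        while f(x + alpha * p) > fx + 1e-4 * alpha * gp:
            alpha *= 0.5
        x_new = x + alpha * p
        g_new = grad(x_new)
        y = g_new - g                # y_i = g_{i+1} - g_i
        s = x_new - x                # s_i = phi_i * p_i = x_{i+1} - x_i
        denom = p @ y
        if abs(denom) > 1e-12:
            beta = (g_new @ y) / denom    # Hestenes-Stiefel-type coefficient
            theta = (g_new @ s) / denom   # third-term weight (illustrative choice)
        else:
            beta = theta = 0.0
        p = -g_new + beta * p - theta * y  # three-term search direction
        x, g = x_new, g_new
    return x
```

On a small convex quadratic this converges to the minimizer; training a feed-forward network would use the network's loss and backpropagated gradient in place of `f` and `grad`.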

Cited by 1 publication (1 citation statement); references 20 publications.
“…where y_i = g_{i+1} − g_i and s_i = φ_i p_i = x_{i+1} − x_i. Furthermore, the design of CG techniques has been studied by many researchers; for more details see [9]–[14].…” (mentioning; confidence: 99%)