2013
DOI: 10.1016/j.amc.2013.09.038

A nonmonotone trust region method based on simple conic models for unconstrained optimization

Cited by 5 publications (4 citation statements)
References 15 publications
“…The derivation of cost functions gradient and Hessian matrix is based on the following properties [21]: …”
Section: Appendix: The Derivation of Gradient and Hessian Matrix of C
confidence: 99%
“…Without a doubt, the iterative optimization algorithm plays an important role in whole process of solving the BSS problem. Below is the expression of the unconstrained optimization problem [12,19,21].…”
Section: Introduction
confidence: 99%
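The expression the excerpt refers to is truncated in the extract; purely as a point of reference (a generic sketch, not the citing paper's exact statement), the unconstrained optimization problem described in these excerpts takes the standard form

\min_{x \in \mathbb{R}^n} f(x), \qquad f : \mathbb{R}^n \to \mathbb{R} \ \text{continuously differentiable and bounded below.}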
“…In the next experiments, we compare Algorithm NCGL with three-term CG method (MDL) [18], inexact Newton method [6] and Trust region method (NSCTR) [19], respectively. The numerical results are listed in Tables 2, 3 and 4.…”
Section: Numerical Experiments
confidence: 99%
“…for which f : R^n → R is a continuously differentiable function and bounded from below. There are many kinds of iterative methods to solve unconstrained optimization problems including the Newton method [28], the steepest descent method [29], the conjugate gradient methods [11,13,16,23], the quasi-Newton methods [9,18–20], line search methods [9,22] and trust-region methods [26,34,37,40]. Conjugate gradient methods [13,16,22,24,30] are the powerful classes of iterative methods to solve unconstrained optimization problems, which are suitable especially for the large-scale problems due to the simplicity of their iterates and low memory requirements.…”
Section: Introduction
confidence: 99%
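The low-memory property claimed for conjugate gradient methods in the excerpt above can be read off the iteration itself; as an illustrative sketch of a generic CG recursion (not any specific variant cited in [13,16,22,24,30]),

x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -\nabla f(x_{k+1}) + \beta_k d_k, \qquad d_0 = -\nabla f(x_0),

where \alpha_k is a line-search step length and \beta_k a scalar update parameter. Only a handful of n-vectors (current iterate, gradient, and search direction) need to be stored at any time, which is what makes these methods attractive for large-scale problems.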