2022
DOI: 10.48550/arxiv.2208.07095
Preprint

A Newton-MR algorithm with complexity guarantees for nonconvex smooth unconstrained optimization

Abstract: In this paper, we consider variants of the Newton-MR algorithm, initially proposed in [62], for solving unconstrained, smooth, but non-convex optimization problems. Unlike the overwhelming majority of Newton-type methods, which rely on the conjugate gradient algorithm as the primary workhorse for their respective sub-problems, Newton-MR employs the minimum residual (MINRES) method. Recently, [51] established certain useful monotonicity properties of MINRES as well as its inherent ability to detect non-positive curvature d…
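As a rough illustration of the idea summarized in the abstract, the sketch below replaces the usual CG sub-problem solve of a Newton-CG-style loop with MINRES (via scipy.sparse.linalg.minres), accessing the Hessian only through matrix-vector products. This is a simplified sketch under stated assumptions: the function names, tolerances, and the plain Armijo-type backtracking line search are illustrative choices and do not reproduce the paper's actual Newton-MR variants or their complexity guarantees.

```python
# Minimal, hypothetical Newton-MR-style iteration: the search direction is
# obtained by (approximately) solving H p = -g with MINRES rather than CG,
# so indefinite Hessians are handled naturally. Names and parameters here
# are illustrative assumptions, not taken from the paper.
import numpy as np
from scipy.sparse.linalg import minres, LinearOperator

def newton_mr_sketch(f, grad, hess_vec, x0, max_iters=100, grad_tol=1e-6,
                     armijo_c=1e-4, max_backtracks=50):
    """Simplified loop: MINRES sub-problem solve + backtracking line search."""
    x = x0.copy()
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) <= grad_tol:
            break
        # The Hessian is accessed only through matrix-vector products.
        H = LinearOperator((x.size, x.size), matvec=lambda v: hess_vec(x, v),
                           dtype=g.dtype)
        # MINRES solves the (possibly indefinite) system H p = -g.
        p, _ = minres(H, -g, maxiter=200)
        # Plain Armijo-type backtracking; the paper's line search differs.
        alpha, fx = 1.0, f(x)
        for _ in range(max_backtracks):
            if f(x + alpha * p) <= fx + armijo_c * alpha * g.dot(p):
                break
            alpha *= 0.5
        x = x + alpha * p
    return x
```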

Cited by 1 publication (27 citation statements)
References: 38 publications
“…In this paper, we design and analyze inexact variants of nonconvex Newton-MR, initially proposed in [43], where the Hessian is approximated (Algorithms 4 and 5). We consider an additive noise model to the Hessian matrix, that is,…”
Section: Contribution
confidence: 99%
“…second-order methods [20,46]. Among these latter class of algorithms are methods with globalization strategies based on trust region [23,62] and cubic regularization [18,19,45,62], as well as those employing line-search such as variants of the classical Newton-CG [53,65], methods based on conjugate residual [24], and more recently Newton-MR variants [41,43,51], which rely on minimum residual (MINRES) sub-problem solver. Arguably, operations involving the Hessian matrix, such as matrix-vector products, are often the primary source of computational costs in second-order algorithms.…”
Section: Introduction
confidence: 99%
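The quoted passage notes that Hessian matrix-vector products are often the primary source of computational cost in such second-order methods. As a minimal, assumed illustration (not taken from the cited works), a Hessian-vector product can be approximated from two gradient evaluations without ever forming the Hessian:

```python
# Illustrative finite-difference Hessian-vector product: H(x) v is approximated
# from gradient evaluations, so the full Hessian is never formed. The function
# names here are placeholders, not from the cited works.
import numpy as np

def hess_vec_fd(grad, x, v, eps=1e-6):
    """Approximate H(x) @ v via a forward difference of the gradient."""
    return (grad(x + eps * v) - grad(x)) / eps
```

In practice, automatic differentiation is usually preferred over finite differences for this product, but the cost profile is similar: roughly one extra gradient-like evaluation per matrix-vector product.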