2017
DOI: 10.1007/s10589-017-9925-6

Using negative curvature in solving nonlinear programs

Abstract: Minimization methods that search along a curvilinear path composed of a non-ascent negative curvature direction in addition to the direction of steepest descent, dating back to the late 1970s, have been an effective approach to finding a stationary point of a function at which its Hessian is positive semidefinite. For constrained nonlinear programs arising from recent applications, the primary goal is to find a stationary point that satisfies the second-order necessary optimality conditions. Motivated by this, …
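
To make the setting concrete, here is a hedged Python sketch of one iteration along a curvilinear path x(α) = x + α²·s + α·d, where s = −∇f(x) is the steepest-descent direction and d is a non-ascent negative curvature direction, in the spirit of the late-1970s methods the abstract refers to. The function names, the eigendecomposition-based choice of d, and the sufficient-decrease test are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def curvilinear_step(f, grad, hess, x, alpha0=1.0, shrink=0.5, c=1e-4, max_backtracks=30):
    """One curvilinear-search iteration: backtrack along
    x(alpha) = x + alpha**2 * s + alpha * d,
    with s the steepest-descent direction and d a non-ascent negative
    curvature direction (if the Hessian has one). All parameter values
    here are illustrative."""
    g, H = grad(x), hess(x)
    s = -g
    # Most-negative eigenpair via a dense factorization; large-scale
    # codes would use a Lanczos-type solver instead.
    eigvals, eigvecs = np.linalg.eigh(H)
    d = np.zeros_like(x)
    if eigvals[0] < 0.0:
        d = eigvecs[:, 0]
        if g @ d > 0.0:        # flip sign so d is a non-ascent direction
            d = -d
    fx, alpha = f(x), alpha0
    for _ in range(max_backtracks):
        trial = x + alpha**2 * s + alpha * d
        # Sufficient decrease combining first- and second-order terms.
        pred = alpha**2 * (g @ s + 0.5 * (d @ H @ d))
        if f(trial) <= fx + c * pred:
            return trial
        alpha *= shrink
    return x  # no acceptable step; caller treats x as (near-)stationary
```

On a toy saddle such as f(x) = x₀² − x₁² started near the origin, the gradient term s nearly vanishes and it is the d-component of the path that moves the iterate off the saddle.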

Cited by 13 publications (14 citation statements) · References 33 publications (47 reference statements)
“…Because of this, a small-stepping descent method will not move far away from Σ_{4θp₀}. For concreteness, we will analyze a variant of the curvilinear search method [Gol80, GMWZ17], which moves in a linear combination of the negative gradient direction −g and a negative curvature direction −v. At the k-th iteration, the algorithm updates a^{(k+1)} as…”
Section: Provable Algorithm for SaS Deconvolution (mentioning, confidence: 99%)
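
The quoted passage is truncated before the update rule is stated, and the exact rule in [GMWZ17] is not reproduced here. Purely as an illustration of a step along "a linear combination of the negative gradient direction −g and a negative curvature direction −v", with hypothetical step sizes:

```python
import numpy as np

def combined_direction_step(a, g, v, eta_g=0.1, eta_v=0.1):
    """Illustrative only: step along a linear combination of -g and -v.
    eta_g and eta_v are hypothetical step sizes, not the cited paper's."""
    if g @ v < 0.0:
        v = -v           # orient v so that -v is a non-ascent direction
    return a - eta_g * g - eta_v * v
```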
“…Second-order methods over a Riemannian manifold are known to be able to escape saddle points, for example the trust-region method [60] and the negative curvature method [52]. Recent works proposed to solve dictionary learning [42] and phase retrieval [40] using these methods, without any special initialization schemes.…”
Section: A Guaranteed First-order Optimization Algorithm (mentioning, confidence: 99%)
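
As a simplified Euclidean analogue of the escape mechanism this excerpt describes (the cited methods operate on a Riemannian manifold, where a retraction would replace the plain vector additions below), a second-order step can probe the smallest Hessian eigenvalue near a first-order stationary point and move along its eigenvector when that eigenvalue is negative. Names and tolerances are assumptions:

```python
import numpy as np

def second_order_step(grad, hess, x, g_tol=1e-6, curv_tol=1e-6, step=1e-2):
    """Take a gradient step while the gradient is large; otherwise test
    for negative curvature and use it to escape a saddle point.
    Euclidean sketch of the idea only."""
    g = grad(x)
    if np.linalg.norm(g) > g_tol:
        return x - step * g                  # ordinary first-order progress
    lam, vecs = np.linalg.eigh(hess(x))
    if lam[0] < -curv_tol:
        return x + step * vecs[:, 0]         # descend along negative curvature
    return x                                 # approximately second-order stationary
```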
“…Despite the use of the term 'minimization' in the last problem, most of the methods proposed in the literature (for its solution) generate a sequence of points {x_k}, which is only guaranteed to converge to stationary points. Thus, specific methods need to be applied in case stationary points for the above problem, satisfying also second-order necessary optimality conditions, are sought (see, for instance, the seminal papers [1][2][3][4][5][6][7] in the framework of truncated Newton methods). Observe that additional care when using the latter methods is definitely mandatory, since imposing standard first-order stationarity conditions may not in general ensure convexity of the quadratic model of the objective function, in a neighborhood of the solution points.…”
Section: Introduction (mentioning, confidence: 99%)
“…Observe that additional care when using the latter methods is definitely mandatory, since imposing standard first-order stationarity conditions may not in general ensure convexity of the quadratic model of the objective function, in a neighborhood of the solution points. In this regard, the computation of so-called negative curvature directions for the objective function is an essential tool (see also the recent papers [4,8]), to guarantee convergence to stationary points which satisfy second-order necessary conditions.…”
Section: Introduction (mentioning, confidence: 99%)
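
A standard device in the truncated Newton framework these excerpts discuss is to detect negative curvature inside the conjugate gradient (CG) solve of the Newton system: if a CG search direction p yields pᵀHp ≤ 0, the quadratic model is non-convex along p, and p can be handed back to the outer method as a direction of non-positive curvature. A sketch under the assumption that H is available as an explicit matrix (practical codes need only Hessian-vector products) and without the cited papers' safeguards:

```python
import numpy as np

def truncated_cg(H, g, tol=1e-8, max_iter=None):
    """CG on the Newton system H x = -g that stops when it meets a
    direction of non-positive curvature. Returns (x, d): d is None if
    no such direction was found, else d satisfies d @ H @ d <= 0."""
    n = g.size
    max_iter = max_iter or 2 * n
    x = np.zeros(n)
    r = -g.astype(float)       # residual of H x = -g at x = 0
    p = r.copy()
    for _ in range(max_iter):
        Hp = H @ p
        curv = p @ Hp
        if curv <= 0.0:
            return x, p        # quadratic model non-convex along p
        alpha = (r @ r) / curv
        x = x + alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return x, None             # no negative curvature encountered
```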