2012
DOI: 10.1007/s10107-012-0514-2

Nonsmooth optimization via quasi-Newton methods

Abstract: We investigate the behavior of quasi-Newton algorithms applied to minimize a nonsmooth function f, not necessarily convex. We introduce an inexact line search that generates a sequence of nested intervals containing a set of points of nonzero measure that satisfy the Armijo and Wolfe conditions if f is absolutely continuous along the line. Furthermore, the line search is guaranteed to terminate if f is semi-algebraic. It seems quite difficult to establish a convergence theorem for quasi-Newton methods applied…
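The nested-interval line search described in the abstract can be illustrated with a short bisection routine: try a step, shrink the bracket when the Armijo (sufficient-decrease) test fails, grow it when the weak Wolfe (curvature) test fails, and stop when both hold. The sketch below is an illustrative Python rendering under those assumptions, not the authors' implementation; the function name and the parameter defaults c1 and c2 are ours.

```python
import numpy as np

def weak_wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection-style inexact line search for the Armijo and weak Wolfe
    conditions, in the spirit of the nested-interval scheme described in
    the abstract. Names and defaults are illustrative assumptions."""
    f0 = f(x)
    slope0 = grad(x) @ d          # directional derivative at t = 0; must be < 0
    a, b = 0.0, np.inf            # current bracket [a, b]
    t = 1.0                       # trial step
    for _ in range(max_iter):
        if f(x + t * d) > f0 + c1 * t * slope0:
            b = t                 # Armijo (sufficient decrease) fails: shrink right end
        elif grad(x + t * d) @ d < c2 * slope0:
            a = t                 # weak Wolfe (curvature) fails: grow left end
        else:
            return t              # both conditions hold: accept the step
        t = 0.5 * (a + b) if b < np.inf else 2.0 * a  # bisect, or expand if unbracketed
    return t                      # fallback: return last trial step
```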

Cited by 318 publications (276 citation statements)
References 39 publications
“…A method based on the BFGS code HANSO v.2.1 available from http://www.cs.nyu.edu/overton/software/hanso/ and kindly made available to us by Michael Overton [17]. Recall that BFGS is a quasi-Newton method with a particular rank-two correction of the approximation of the Hessian at each iteration.…”
Section: Numerical Experiments (mentioning)
confidence: 99%
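The "particular rank-two correction" recalled in this excerpt is the standard BFGS update. In inverse-Hessian form, with s = x_new - x and y = grad_new - grad, it reads H+ = (I - rho*s*y') H (I - rho*y*s') + rho*s*s' with rho = 1/(y's). The sketch below uses our own notation and is not taken from the HANSO code:

```python
import numpy as np

def bfgs_update(H, s, y):
    """One BFGS rank-two correction of the inverse Hessian approximation H.

    s = x_{k+1} - x_k and y = grad_{k+1} - grad_k. Skipping the update when
    y's <= 0 is a common safeguard that preserves positive definiteness;
    the curvature condition of a Wolfe line search guarantees y's > 0."""
    ys = y @ s
    if ys <= 1e-12:
        return H                               # safeguard: skip the update
    rho = 1.0 / ys
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    return V @ H @ V.T + rho * np.outer(s, s)  # rank-two correction
```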
“…For minimizing the post-processing objective function (Eq. 12), we used the BFGS-based quasi-Newton method implementation by Lewis and Overton (2012). All the experiments were carried out on an Ubuntu 11.04 Linux PC with a 2.8 GHz Intel Core i7 Quad-Core processor and 8 GB of memory.…”
Section: Methods (mentioning)
confidence: 99%
“…For this refinement, we use a BFGS-based quasi-Newton method (Lewis and Overton, 2012) that only requires the value of the objective function and its gradient at each point. Letting X (0) = X SDP , we iteratively minimize the following objective function:…”
Section: Post-processing (mentioning)
confidence: 99%
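A BFGS iteration that, as this excerpt notes, needs only the objective value and gradient at each point can be assembled from the two sketches above. The driver loop below is a hypothetical composition under those assumptions, not the implementation of Lewis and Overton (2012) or of the paper quoted here:

```python
import numpy as np

def bfgs_minimize(f, grad, x0, tol=1e-6, max_iter=500):
    """Minimal BFGS driver using only f and grad, composed from the
    weak_wolfe_line_search and bfgs_update sketches above."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                         # initial inverse Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break                              # gradient small enough: stop
        d = -H @ g                             # quasi-Newton search direction
        t = weak_wolfe_line_search(f, grad, x, d)
        x_new = x + t * d
        g_new = grad(x_new)
        H = bfgs_update(H, x_new - x, g_new - g)
        x, g = x_new, g_new
    return x
```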
“…The search direction is of type [40]. In the case of non-smooth functions, BFGS typically succeeds in finding a local minimizer, as indicated by Overton et al. [41]. However, this requires some attention to the line search conditions.…”
Section: Appendix 1: The L-BFGS-B Algorithm (mentioning)
confidence: 99%