2018
DOI: 10.1137/16m1108145

Line Search Algorithms for Locally Lipschitz Functions on Riemannian Manifolds

Abstract: This paper presents line search algorithms for finding extrema of locally Lipschitz functions defined on Riemannian manifolds. To this end we generalize the so-called Wolfe conditions for nonsmooth functions on Riemannian manifolds. Using ε-subgradient-oriented descent directions and the Wolfe conditions, we propose a nonsmooth Riemannian line search algorithm and establish the convergence of our algorithm to a stationary point. Moreover, we extend the classical BFGS algorithm to nonsmooth functions on Riemannian manifolds…
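For background, here is a minimal sketch of the classical smooth-case Riemannian Wolfe conditions that the abstract says are generalized to the nonsmooth setting; the retraction R_x, its differential, and the constants 0 < c_1 < c_2 < 1 are conventional notation, not quoted from the paper, which replaces the gradient by (ε-)subgradients.

```latex
% Conventional smooth-case Riemannian Wolfe conditions (background, not quoted from the paper):
% x is a point on the manifold M, d a descent direction in T_x M, R_x a retraction,
% and 0 < c_1 < c_2 < 1. A step size alpha > 0 is accepted when
\begin{aligned}
  f\bigl(R_x(\alpha d)\bigr) &\le f(x) + c_1\,\alpha\,\bigl\langle \operatorname{grad} f(x),\, d \bigr\rangle_x
  && \text{(sufficient decrease)}\\
  \bigl\langle \operatorname{grad} f\bigl(R_x(\alpha d)\bigr),\, \mathrm{D}R_x(\alpha d)[d] \bigr\rangle &\ge c_2\,\bigl\langle \operatorname{grad} f(x),\, d \bigr\rangle_x
  && \text{(curvature)}
\end{aligned}
% In the nonsmooth case treated in the paper, grad f is replaced by elements of the
% (epsilon-)subdifferential and the curvature condition is stated with suitable transports.
```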

Cited by 59 publications (40 citation statements)
References 26 publications
“…As the subsolver for RALM and the smoothing methods REPMS, we use LRBFGS: a Riemannian limited-memory BFGS [36] as implemented in Manopt [18]. For REPMSD, we use a minimum-norm tangent vector in the extended subgradient, and then an LRBFGS-type inverse Hessian approximation for the Hessian updates to choose the update direction [33]. For these limited-memory subsolvers, we let the memory be 30, the maximum number of iterations be 200, and the minimum step size be 10^{-10}.…”
Section: Dataset Name, Number Of Data, Features, Clusters (mentioning)
confidence: 99%
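The quoted setup selects a minimum-norm tangent vector from a subgradient set before applying the limited-memory update. Below is a minimal sketch of that generic ingredient, assuming the sampled (ε-)subgradients are stacked as rows of a matrix in a common tangent-space basis; the exact extended-subgradient construction of REPMSD [33] and the Manopt LRBFGS options are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize

def min_norm_convex_combination(G):
    """Minimum-norm element of conv{g_1, ..., g_m}.

    G is an (m, n) array whose rows g_i are sampled (eps-)subgradients expressed
    in one tangent-space basis. We solve  min_{lam in simplex} ||G^T lam||^2,
    a small convex QP, and return G^T lam*.  The negative of this vector is the
    usual candidate descent direction in minimum-norm subgradient methods.
    """
    m = G.shape[0]
    Q = G @ G.T                                  # Gram matrix: objective is lam^T Q lam
    objective = lambda lam: lam @ Q @ lam
    gradient = lambda lam: 2.0 * (Q @ lam)
    constraints = ({'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},)
    bounds = [(0.0, 1.0)] * m
    lam0 = np.full(m, 1.0 / m)                   # start from the simplex barycenter
    result = minimize(objective, lam0, jac=gradient, bounds=bounds,
                      constraints=constraints, method='SLSQP')
    return G.T @ result.x

# Toy usage with three sampled subgradients in R^2; the descent direction is -g_min.
G = np.array([[1.0, 0.0], [0.0, 1.0], [0.8, 0.6]])
g_min = min_norm_convex_combination(G)
print(g_min, np.linalg.norm(g_min))
```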
“…A concise survey of Newton-based methods can be found in [14]. Since the step lengths in (20) are based on first- or second-order Taylor polynomials, the step size can be chosen via line-search [15] and/or trust-region [16] methods. Thus, we can ensure global convergence of the optimization methods to stationary points of the cost function (1).…”
Section: Line Search Optimization Methods (mentioning)
confidence: 99%
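As a reminder of what the quoted passage refers to, the standard first- and second-order Riemannian models around an iterate x_k are sketched below; these are textbook forms, and equation (20) of the citing paper is not reproduced.

```latex
% Line-search methods pick a step along a descent direction using the first-order model,
% while trust-region methods minimize the second-order model within a radius Delta_k.
\begin{aligned}
  m_k^{(1)}(d) &= f(x_k) + \bigl\langle \operatorname{grad} f(x_k),\, d \bigr\rangle_{x_k},\\
  m_k^{(2)}(d) &= m_k^{(1)}(d) + \tfrac{1}{2}\bigl\langle \operatorname{Hess} f(x_k)[d],\, d \bigr\rangle_{x_k},
  \qquad
  \min_{d \in T_{x_k}M,\ \|d\|_{x_k} \le \Delta_k} m_k^{(2)}(d).
\end{aligned}
```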
“…It is possible to develop new minimizers by making particular choices for the Riemannian operators in Alg. 2. We now review the Armijo variant of Riemannian gradient descent [40], which is a common and probably the simplest choice for an accelerated optimizer on the manifold. However, many other line-search conditions, such as Barzilai–Borwein [47] or strong Wolfe [70], can also be used.…”
Section: A.2 Riemannian Descent and Line Search (mentioning)
confidence: 99%
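To make the quoted phrase concrete, here is a minimal sketch of Armijo-backtracking Riemannian gradient descent, assuming the unit sphere as the manifold, normalization as the retraction, and a Rayleigh-quotient cost for the demo; the parameter values and helper names are illustrative, not the cited papers' algorithms.

```python
import numpy as np

def sphere_retract(x, v):
    """Retraction on the unit sphere: step along the tangent vector v, then renormalize."""
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_grad(g_euclid, x):
    """Project the Euclidean gradient onto the tangent space T_x S^{n-1}."""
    return g_euclid - (g_euclid @ x) * x

def armijo_gradient_descent(cost, egrad, x0, beta=0.5, sigma=1e-4,
                            alpha0=1.0, tol=1e-8, maxiter=500):
    """Riemannian gradient descent with an Armijo backtracking line search."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(maxiter):
        g = riemannian_grad(egrad(x), x)
        gnorm2 = g @ g
        if gnorm2 < tol ** 2:
            break
        alpha = alpha0
        # Armijo condition: f(R_x(-alpha g)) <= f(x) - sigma * alpha * ||g||^2.
        while cost(sphere_retract(x, -alpha * g)) > cost(x) - sigma * alpha * gnorm2:
            alpha *= beta
        x = sphere_retract(x, -alpha * g)
    return x

# Toy usage: minimize the Rayleigh quotient x^T A x over the sphere; the minimizer
# is (up to sign) the eigenvector of A with the smallest eigenvalue.
A = np.diag([3.0, 1.0, 0.5])
x_star = armijo_gradient_descent(lambda x: x @ A @ x, lambda x: 2.0 * A @ x,
                                 np.array([1.0, 1.0, 1.0]))
print(x_star)
```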