2020
DOI: 10.1007/s10589-020-00243-6
Globalized inexact proximal Newton-type methods for nonconvex composite functions

Abstract: Optimization problems with composite functions consist of an objective function which is the sum of a smooth and a (convex) nonsmooth term. This particular structure is exploited by the class of proximal gradient methods and some of their generalizations, such as proximal Newton and proximal quasi-Newton methods. The current literature on these classes of methods almost exclusively considers the case where the smooth term is also convex. Here we present a globalized proximal Newton-type method which allows the smooth term …
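To illustrate the proximal gradient step named in the abstract, here is a minimal sketch for the composite problem min_x f(x) + λ‖x‖₁ (the ℓ1 regularizer, the toy data, the step size rule, and all function names below are illustrative assumptions, not taken from the paper):

    import numpy as np

    def soft_threshold(v, tau):
        # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
        return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

    def prox_grad_step(x, grad_f, alpha, lam):
        # One proximal gradient iteration: a gradient step on the smooth
        # part f, followed by the proximal map of the nonsmooth part.
        return soft_threshold(x - alpha * grad_f(x), alpha * lam)

    # Toy instance: f(x) = 0.5 * ||A x - b||^2 (smooth), g(x) = lam * ||x||_1.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 5))
    b = rng.standard_normal(20)
    grad_f = lambda x: A.T @ (A @ x - b)
    alpha = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, with L the Lipschitz constant of grad_f
    x = np.zeros(5)
    for _ in range(200):
        x = prox_grad_step(x, grad_f, alpha, lam=0.1)

A proximal Newton-type method replaces the scaled identity implicit in this step with a Hessian or quasi-Newton model of f; globalizing that generalization for nonconvex f is the subject of the paper.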

Cited by 25 publications (39 citation statements; 1 supporting, 38 mentioning, 0 contrasting). References: 44 publications.
“…By combining this with the first inequality in (20), it is immediate to deduce that ϑ′_k(t) > 0 for t ∈ (0, t̃_k) and ϑ′_k(t) < 0 for t ∈ (t̃_k, t_k − δ).…”
Section: Remark 3.1 (a) (mentioning; confidence: 93%)
“…To achieve global convergence, a boundedness condition on the search directions (see (16)) is also required in [32,34,1]. We also notice that the inexact proximal Newton-type method in [22] was recently extended by Kanzow and Lechner [20] to solve the problem (2) with only a convex g, which essentially belongs to weakly convex optimization. Their global and local superlinear convergence results require the local strong convexity of Ψ around any stationary point.…”
Section: Related Work (mentioning; confidence: 99%)
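For context, the proximal Newton-type direction discussed in this excerpt solves, in its generic form, the subproblem below (a sketch; the precise subproblem, the inexactness test, and the boundedness condition (16) are defined in the cited works):

    d_k \approx \operatorname*{arg\,min}_{d} \; \nabla f(x_k)^{\top} d + \tfrac{1}{2}\, d^{\top} H_k\, d + g(x_k + d),

where H_k is a Hessian or quasi-Newton model of the smooth part f at x_k; the cited global convergence arguments additionally require, among other things, that the directions d_k remain bounded.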
“…Indeed, (P) can model linear and convex quadratic programming instances, regularized (group) lasso instances (often arising in signal or image processing and machine learning, e.g. see [14,62,69]), as well as sub-problems arising from the linearization of a nonlinear (possibly non-convex or non-smooth) problem (such as those arising within sequential quadratic programming [10] or globalized proximal Newton methods [38,39]). Furthermore, various optimal control problems can be tackled in the form of (P), such as those arising from L1-regularized partial differential equation (PDE) optimization, assuming that a discretize-then-optimize strategy is adopted (e.g.…”
Section: Introduction (mentioning; confidence: 99%)
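To make the modeling claim concrete (a sketch under the standard composite assumption; the exact statement of (P) appears in the citing paper): writing the template as

    \min_{x} \; f(x) + g(x),

a lasso instance takes f(x) = ½‖Ax − b‖² and g(x) = λ‖x‖₁, a group lasso takes g(x) = λ Σ_i ‖x_{G_i}‖₂ over index groups G_i, and a linear or convex quadratic program takes an affine or quadratic f with g the indicator function of the feasible polyhedron.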
“…see [13,32,43,45,46,50,52,56,65]), variants of the proximal point method (e.g. see [20,24,25,38,39,41,49,59]), or interior point methods (IPMs) applied to a reformulation of (P) (e.g. see [21,28,31,51]).…”
Section: Introduction (mentioning; confidence: 99%)