2020
DOI: 10.1007/s10107-020-01470-9

High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms

Abstract: This paper studies high-order evaluation complexity for partially separable convexly-constrained optimization involving non-Lipschitzian group sparsity terms in a nonconvex objective function. We propose a partially separable adaptive regularization algorithm using a p-th order Taylor model and show that the algorithm can produce an (ε, δ)-approximate q-th-order stationary point in at most O(ε^{-(p+1)/(p-q+1)}) evaluations of the objective function and its first p derivatives (whenever they exist). Our model uses th…
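To make the quoted rate concrete, the following small sketch (not taken from the paper itself; only the exponent formula above is) instantiates the bound for a few choices of the model order p and the stationarity order q:

\[
  O\!\left(\epsilon^{-(p+1)/(p-q+1)}\right) =
  \begin{cases}
    O(\epsilon^{-2}),   & p = 1,\ q = 1 \quad \text{(first-order model, first-order point)}\\
    O(\epsilon^{-3/2}), & p = 2,\ q = 1 \quad \text{(second-order model, first-order point)}\\
    O(\epsilon^{-3}),   & p = 2,\ q = 2 \quad \text{(second-order model, second-order point)}
  \end{cases}
\]

The p = 2, q = 1 case recovers the familiar O(ε^{-3/2}) evaluation bound of cubic-regularization-type methods for first-order stationarity, so the general exponent interpolates between well-known special cases.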

Citations: cited by 12 publications (15 citation statements)
References: 38 publications
“…After [2], several high-order methods have been proposed and analysed for nonconvex optimization (see, e.g. [9-11, 23]), resulting even in worst-case complexity bounds for the number of iterations that p-th-order methods need to generate approximate q-th-order stationary points [7,8].…”
Section: Motivation
mentioning
confidence: 99%
“…If ν is unknown, by (11) we set α = 1 in Algorithm 1. The resulting algorithm is a universal scheme that can be viewed as a generalization of the universal second-order method (6.10) in [16].…”
Section: Remark 3.1
mentioning
confidence: 99%
“…
It is well known that finding a global optimum is extremely challenging for nonconvex optimization. There are some recent efforts [1, 12-14] regarding optimization methods for computing higher-order critical points, which can exclude the so-called degenerate saddle points and reach a solution of better quality. Despite the theoretical development in [1, 12-14], the corresponding numerical experiments are missing.
…”
mentioning
confidence: 99%
“…There are some recent efforts [1, 12-14] regarding optimization methods for computing higher-order critical points, which can exclude the so-called degenerate saddle points and reach a solution of better quality. Despite the theoretical development in [1, 12-14], the corresponding numerical experiments are missing. In this paper, we propose an implementable higher-order method, named the adaptive high-order method (AHOM), that aims to find third-order critical points.…”
mentioning
confidence: 99%