2014
DOI: 10.48550/arxiv.1411.2129
Preprint

Interior-point algorithms for convex optimization based on primal-dual metrics

Abstract: We propose and analyse primal-dual interior-point algorithms for convex optimization problems in conic form. The families of algorithms we analyse are so-called short-step algorithms, and they match the current best iteration complexity bounds for the primal-dual symmetric interior-point algorithms of Nesterov and Todd for symmetric cone programming problems with given self-scaled barriers. Our results apply to any self-concordant barrier for any convex cone. We also prove that certain specializations of our algorit…
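
The short-step, path-following scheme named in the abstract can be illustrated on the simplest conic case. The sketch below is not the paper's algorithm (which works with general self-concordant barriers and primal-dual local metrics); it is a hedged, textbook-style short-step barrier method for an LP over the nonnegative orthant, using the log barrier with barrier parameter ν = n, the canonical short-step update t ← t(1 + 1/(8√ν)), and one damped Newton step per update. The toy problem and all names are illustrative.

```python
import numpy as np

def short_step_barrier_lp(A, b, c, x0, t0=1.0, iters=150):
    """Toy short-step path-following for min c^T x s.t. Ax = b, x > 0,
    using the log barrier -sum(log x_i) (barrier parameter nu = n)."""
    x, t = x0.astype(float), float(t0)
    n, m = len(c), A.shape[0]
    growth = 1.0 + 1.0 / (8.0 * np.sqrt(n))       # short-step update of t
    for _ in range(iters):
        t *= growth
        # One Newton step on t*c^T x - sum(log x_i) subject to Ax = b.
        g = t * c - 1.0 / x                        # gradient
        H = np.diag(1.0 / x**2)                    # barrier Hessian
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])
        dx = np.linalg.solve(K, np.concatenate([-g, np.zeros(m)]))[:n]
        lam = np.sqrt(dx @ H @ dx)                 # Newton decrement
        x = x + dx / (1.0 + lam)                   # damped step keeps x > 0
    return x

# Toy LP: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0; the optimum is (1, 0).
A, b, c = np.array([[1.0, 1.0]]), np.array([1.0]), np.array([1.0, 2.0])
x0 = np.array([(np.sqrt(5) - 1) / 2, (3 - np.sqrt(5)) / 2])  # on the central path at t = 1
x = short_step_barrier_lp(A, b, c, x0)
```

The damped step length 1/(1 + λ), with λ the Newton decrement, is what keeps every iterate strictly feasible: for a self-concordant barrier the resulting step has local norm below one, so it stays inside the Dikin ellipsoid.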

Cited by 6 publications (16 citation statements)
References 29 publications
“…Our approach (and in general interior-point methods) returns more robust certificates in provably stronger (polynomial) iteration complexity bounds compared to first-order methods such as Douglas-Rachford splitting [14], at the price of higher computational cost per iteration. However, as explained in [9], the quasi-Newton type ideas for deriving suitable primal-dual local metrics in [40,24] can be used to make our algorithm scalable, while preserving some primal-dual symmetry. The rest of the article covers:…”
Section: Infeasible
confidence: 99%
“…Based on the insights we have gained by our performance analyses of the PtPCA algorithm in detecting the possible statuses for a given problem, we can discuss the stopping criteria and certificates returned by this algorithm in a practical setup. Even though applications of interior-point methods beyond the scope of symmetric cones have been studied [40,28,36,24,1], there is no well-established software close to optimization in the Domain-Driven form. Let us review the existing stopping criteria for some well-known solvers for optimization over symmetric cones (using the formulation in (5)).…”
Section: Stopping Criteria and Conclusion
confidence: 99%
“…As such, hyperbolic optimization problems can be solved using interior point methods as long as the polynomial p can be evaluated efficiently. More recently, other algorithmic approaches to solving hyperbolic optimization problems have been developed, including primal-dual interior point methods [MT14], affine scaling methods [RS14], first-order methods based on applying a subgradient method to a transformation of the problem [Ren16] and accelerated modifications tailored for hyperbolic programs [Ren19].…”
Section: Hyperbolic Programming
confidence: 99%
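
The hyperbolic-programming line of work quoted above rests on one defining property: a polynomial p is hyperbolic in direction e if t ↦ p(x − t·e) has only real roots for every real x, and its hyperbolicity cone is the feasible set. A small numerical illustration of that property (a toy example of my own, not drawn from any of the cited papers): p(x1, x2, x3) = x1·x2 − x3², the determinant of a 2×2 symmetric matrix, which is hyperbolic in the direction e = (1, 1, 0) with a rotated second-order cone as its hyperbolicity cone.

```python
import numpy as np

def roots_along_e(x):
    """Roots of t -> p(x - t*e) for p(x1,x2,x3) = x1*x2 - x3^2, e = (1,1,0).

    p(x - t*e) = (x1 - t)(x2 - t) - x3^2 = t^2 - (x1 + x2)*t + (x1*x2 - x3^2).
    """
    x1, x2, x3 = x
    return np.roots([1.0, -(x1 + x2), x1 * x2 - x3**2])

# The discriminant (x1 - x2)^2 + 4*x3^2 is nonnegative, so the roots are
# real for every x -- exactly the defining property of hyperbolicity.
rng = np.random.default_rng(0)
for _ in range(1000):
    r = roots_along_e(rng.normal(size=3))
    assert np.all(np.abs(np.imag(r)) < 1e-8)
```

For a general hyperbolic p the roots play the role that eigenvalues play for the semidefinite cone, which is why − log p is the natural self-concordant barrier used by the interior-point approaches mentioned in the quote.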
“…However, the applications and software for conic optimization itself have not gone much beyond optimization over symmetric (self-scaled) cones; more specifically, linear programming (LP), second-order cone programming (SOCP), and semidefinite programming (SDP). Some of the desired properties of optimization over symmetric cones have been extended to general conic optimization [49,38,48,33]. While the conic reformulation implies that, under reasonable assumptions, all convex optimization problems enjoy the same iteration complexity bounds, there is a gap (which has remained unchanged for many years) between the efficiency and robustness of the software we have for optimization over symmetric cones and that available for many other classes of problems.…”
Section: Introduction
confidence: 99%