Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence 2019
DOI: 10.24963/ijcai.2019/157
Entropy-Penalized Semidefinite Programming

Abstract: Low-rank methods for semidefinite programming (SDP) have recently gained considerable interest, especially in machine learning applications. Their analysis often involves determinant-based or Schatten-norm penalties, which are difficult to implement in practice due to their high computational cost. In this paper, we propose Entropy-Penalized Semi-Definite Programming (EP-SDP), which provides a unified framework for a broad class of penalty functions used in practice to promote a low-rank solution. We show that EP-…
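The abstract describes penalizing an SDP solution to promote low rank. As an illustration of the general idea (not the paper's exact formulation), the sketch below computes a von Neumann-style entropy of the normalized eigenvalue spectrum of a PSD matrix: a concentrated spectrum (low rank) yields low entropy, a spread-out spectrum yields high entropy. The function name `entropy_penalty` is a placeholder chosen here.

```python
import numpy as np

def entropy_penalty(X, eps=1e-12):
    """Von Neumann-style entropy of the normalized spectrum of a PSD matrix X.

    Low values indicate a concentrated spectrum, i.e. an (approximately)
    low-rank matrix.  Illustrative sketch only; EP-SDP's exact penalty
    may differ.
    """
    w = np.clip(np.linalg.eigvalsh(X), 0.0, None)  # eigenvalues, clipped to >= 0
    p = w / w.sum()                                # spectrum as a probability vector
    return float(-np.sum(p * np.log(p + eps)))     # Shannon entropy of the spectrum

# A rank-one matrix has (near-)zero entropy; the full-rank identity has log(n).
low_rank = np.outer(np.ones(3), np.ones(3))        # rank 1, spectrum (3, 0, 0)
print(entropy_penalty(low_rank))                   # close to 0
print(entropy_penalty(np.eye(3)))                  # close to log(3)
```

Minimizing such a term alongside the SDP objective drives mass onto few eigenvalues, which is the low-rank-promoting effect the abstract refers to.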

Cited by 8 publications (6 citation statements)
References 25 publications
“…(QP) Relaxation SOCP Relaxations [74] (SDP) Relaxation [53,65,75] Entropy-penalized SDP [76] Sparse Moment SDP [73] Moment SDP [72] Classical pre-processing Quantum circuit (QC) 0.5-Approximation [77,78] Random hyperplane [53,75] Iterative [79,80] Iterative [81] Measurement in a QC We now elaborate on the example of Goemans-Williamson random-hyperplane rounding of (SDP). For a given GW cut we generate an initial state using Y -rotations with a ε ∈ (0, 0.5), as discussed in Sec.…”
Section: Variant
confidence: 99%
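The quoted passage refers to Goemans-Williamson random-hyperplane rounding of an SDP relaxation. A minimal sketch of that rounding step, assuming the input `X` is a feasible PSD Max-Cut SDP solution with unit diagonal (the helper name `gw_round` is chosen here for illustration):

```python
import numpy as np

def gw_round(X, rng=None):
    """Round a PSD Max-Cut SDP solution X to a +/-1 cut via a random hyperplane."""
    rng = rng or np.random.default_rng(0)
    # Factor X = V V^T via eigendecomposition (clip tiny negative eigenvalues).
    w, U = np.linalg.eigh(X)
    V = U * np.sqrt(np.clip(w, 0.0, None))
    r = rng.standard_normal(X.shape[0])  # normal vector of the random hyperplane
    signs = np.sign(V @ r)               # side of the hyperplane for each vertex vector
    signs[signs == 0] = 1.0              # break ties deterministically
    return signs

# Toy example: the identity is feasible for the Max-Cut SDP (unit diagonal, PSD).
cut = gw_round(np.eye(4))
print(cut)  # a vector of +/-1 labels, one per vertex
```

In the warm-start schemes the passage describes, such a cut is then encoded into an initial quantum state via Y-rotations rather than used directly.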
“…For example, the (QP) relaxation can be seen as a second-order cone programming (SOCP) relaxation, and could be strengthened iteratively [74], until its objective-function value coincides with the objective-function value of the non-convex (QUBO), albeit at the cost of an exponential growth of the relaxation. Similarly, one could strengthen the (SDP) relaxation either by using an entropy-penalizing term [76] or by using the Moment/SOS hierarchy [72] and its sparse variant [73].…”
Section: Further Variants Of Warm-starting Quantum Optimization
confidence: 99%
“…(QP) Relaxation SOCP Relaxations [72] (SDP) Relaxation [50,63,73] Entropy-penalized SDP [74] Sparse Moment SDP [71] Moment SDP [70] Classical pre-processing Quantum circuit (QC) 0.5-Approximation [75,76] Random hyperplane [50,73] Iterative [77,78] Iterative [79] Measurement in a QC WS-QAOA WS-RQAOA…”
Section: Variant
confidence: 99%
“…For example, the (QP) relaxation can be seen as a second-order cone programming (SOCP) relaxation, and could be strengthened iteratively [72], until its objective-function value coincides with the objective-function value of the non-convex (QUBO), albeit at the cost of an exponential growth of the relaxation. Similarly, one could strengthen the (SDP) relaxation either by using an entropy-penalizing term [74] or by using the Moment/SOS hierarchy [70] and its sparse variant [71], which converge faster than the SOCP hierarchy [72], from a stronger basic relaxation.…”
Section: Further Variants Of Warm-starting Quantum Optimization
confidence: 99%
“…The non-negative constraints on the semidefinite variable can be further relaxed into non-negative constraints on the factorized variable. To solve the original problem better, h(X) can be the entropic penalty function [25,43,35] to find a low-rank solution.…”
confidence: 99%