2013
DOI: 10.1080/00036811.2013.859251
A new class of alternative theorems for SOS-convex inequalities and robust optimization

Abstract: In this paper, we present a new class of alternative theorems for SOS-convex inequality systems without any qualifications. This class of theorems provides alternative equations, in terms of sums of squares, for the solvability of the given inequality system. A strong separation theorem for convex sets, described by convex polynomial inequalities, plays a key role in establishing the class of alternative theorems. Consequently, we show that the optimal values of various classes of robust convex optimization pr…

Cited by 19 publications (6 citation statements)
References 20 publications
“…(3) In Section 5, we briefly outline how our results can be applied to show that robust SOSconvex optimization problems under restricted spectrahedron data uncertainty enjoy exact semi-definite programming relaxations. This extends the existing result for restricted ellipsoidal data uncertainty established in [13] and answers the open questions left in [13] on how to recover a robust solution from the semi-definite programming relaxation in this broader setting.…”
Section: Introduction (supporting)
confidence: 76%
“…In this section, we briefly outline how our results can be applied to the area of robust optimization [4] (for some recent development see [8,9,12,13]). Consider the following robust SOS-convex optimization problem…”
Section: Applications to Robust Optimization (mentioning)
confidence: 99%
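The quoted passage truncates the problem statement. A generic robust convex program of the kind discussed has the following shape (a hedged sketch for orientation only; the symbols f, g_i, and the uncertainty sets U_i are supplied here for illustration and are not taken from the paper):

```latex
% Robust convex optimization: each constraint must hold for
% every realization of the uncertain data u_i in its set U_i.
\min_{x \in \mathbb{R}^n} \; f(x)
\quad \text{subject to} \quad
g_i(x, u_i) \le 0 \;\; \text{for all } u_i \in \mathcal{U}_i,\; i = 1, \dots, m.
```

When f and the g_i(·, u_i) are SOS-convex polynomials and the sets U_i are suitably structured (e.g., spectrahedral, as in the statement above), such problems can admit exact semidefinite programming relaxations.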
“…When it comes to deriving mathematical principles of optimality or duality for optimization problems, there is no better mathematical tool than the Farkas lemma or its generalizations [2,10,9,12,14,16,22]. The celebrated, hundred-year-old Farkas lemma is a fundamental result for systems of linear inequalities.…”
Section: Introduction (mentioning)
confidence: 99%
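For reference, the classical linear form of the Farkas lemma mentioned in the statement above (the standard textbook formulation, not quoted from the paper) is a theorem of the alternative: exactly one of the two systems below is solvable.

```latex
% Farkas lemma: for A \in \mathbb{R}^{m \times n}, b \in \mathbb{R}^m,
% exactly one of the following holds:
\exists\, x \in \mathbb{R}^n:\; Ax = b,\; x \ge 0
\qquad \text{or} \qquad
\exists\, y \in \mathbb{R}^m:\; A^{\mathsf T} y \ge 0,\; b^{\mathsf T} y < 0.
```

The alternative theorems of the cited paper generalize this dichotomy from linear systems to SOS-convex polynomial inequality systems, with sum-of-squares certificates playing the role of the dual vector y.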
“…ᵀ(x − y) is a sum-of-squares polynomial in (x, y) on ℝⁿ × ℝⁿ (see [1,18]) or, equivalently, if its Hessian matrix ∇²f is an SOS matrix polynomial (i.e.,…”
Section: Exact Formulas for Radius of Robust Feasibility (mentioning)
confidence: 99%
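For context, the standard Hessian characterization the truncated fragment refers to (a standard definition from the SOS-convexity literature, not a quote from the paper): a polynomial f is SOS-convex when its Hessian admits a sum-of-squares factorization.

```latex
% f is SOS-convex iff there is a matrix polynomial M(x) with
\nabla^2 f(x) = M(x)^{\mathsf T} M(x) \quad \text{for all } x \in \mathbb{R}^n,
```

which certifies convexity of f via a semidefinite programming feasibility check, whereas deciding convexity of a general polynomial is NP-hard.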