2021
DOI: 10.1007/s10107-021-01734-y

Ideal formulations for constrained convex optimization problems with indicator variables

Abstract: Motivated by modern regression applications, in this paper we study the convexification of a class of convex optimization problems with indicator variables and combinatorial constraints on the indicators. Unlike most of the previous work on convexification of sparse regression problems, we simultaneously consider the nonlinear non-separable objective, indicator variables, and combinatorial constraints. Specifically, we give the convex hull description of the epigraph of the composition of a one-dimensional con…
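As a concrete, purely illustrative sketch of the problem class the abstract describes, the snippet below sets up a sparse regression with indicator variables and a cardinality (combinatorial) constraint on the indicators, using a standard big-M model; the synthetic data, the bound M, and the solver choice are assumptions for illustration and are not taken from the paper.

```python
import numpy as np
import cvxpy as cp

# Synthetic sparse-regression instance (hypothetical data for illustration only)
rng = np.random.default_rng(0)
m, n, k = 60, 12, 3                      # samples, features, sparsity budget
A = rng.standard_normal((m, n))
beta_true = np.zeros(n)
beta_true[:k] = 1.0
y = A @ beta_true + 0.05 * rng.standard_normal(m)

beta = cp.Variable(n)                    # continuous regression coefficients
x = cp.Variable(n, boolean=True)         # indicator variables: x_i = 1 if beta_i may be nonzero
M = 10.0                                 # assumed valid big-M bound on |beta_i|

constraints = [
    cp.abs(beta) <= M * x,               # forces beta_i = 0 whenever x_i = 0
    cp.sum(x) <= k,                      # combinatorial constraint on the indicators
]
prob = cp.Problem(cp.Minimize(cp.sum_squares(A @ beta - y)), constraints)
prob.solve()                             # needs a mixed-integer-capable solver (e.g., ECOS_BB, GUROBI)
print("selected support:", np.flatnonzero(x.value > 0.5))
```

The paper's contribution concerns strengthening the continuous relaxation of models of this kind through ideal (convex-hull) formulations, rather than the big-M model itself, whose relaxation is known to be weak.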

Cited by 18 publications (4 citation statements)
References: 54 publications
“…In [2], the authors consider the set S_2 with the additional constraint x ≤ 1, and obtain an extended formulation for the convex hull of this set that is SDP representable. More results regarding the convexification of Problem (MIQPI) can be found in [11,13,21,20].…”
Section: “… And Derive Conv(Z −” (mentioning)
Confidence: 99%
“…Atamtürk and Gómez [7] give the convex hull description of a rank-one function with free continuous variables, and propose an SDP formulation to tackle quadratic optimization problems with free variables arising in sparse regression. Wei et al. [50,51] extend those results, deriving ideal formulations for rank-one functions with arbitrary constraints on the indicator variables x. These formulations are shown to be effective in sparse regression problems; however, as they do not account for the non-negativity constraints on the continuous variables, they are weak for (1).…”
Section: Introduction (mentioning)
Confidence: 97%
“…Atamtürk and Gómez [7] give the convex hull description of a rank-one function with free continuous variables, and propose an SDP formulation to tackle quadratic optimization problems with free variables arising in sparse regression. Wei et al. [51,52] extend those results, deriving ideal formulations for rank-one functions with arbitrary constraints on the indicator variables x. These formulations are shown to be effective in sparse regression problems; however, as they do not account for the non-negativity constraints on the continuous variables, they are weak for (1).…”
Section: Introduction (mentioning)
Confidence: 97%
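For context on the rank-one results mentioned in these citation statements, a schematic of the mixed-integer epigraph set studied in this line of work is given below; the notation (z for continuous variables, x for their indicators, a fixed vector a, and a set X of combinatorial constraints on the indicators) is assumed for illustration and is not copied from either cited paper.

```latex
% Schematic rank-one epigraph set with indicator variables (notation assumed):
\[
  Z \;=\; \Bigl\{ (x, z, t) \in \{0,1\}^n \times \mathbb{R}^n \times \mathbb{R} \;:\;
      t \ \ge\ \bigl(a^\top z\bigr)^{2},\quad
      z_i (1 - x_i) = 0 \ \ \forall i,\quad
      x \in X \Bigr\}.
\]
```

The statements above contrast the free-variable case (z unrestricted in sign) with the variant in which z ≥ 0 is imposed: formulations that are ideal for the former need not remain strong once the non-negativity constraints are added.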