2020
DOI: 10.1214/19-sts701
Sparse Regression: Scalable Algorithms and Empirical Performance

Citation Types: 1 supporting, 59 mentioning, 0 contrasting

Cited by 40 publications (60 citation statements). References 39 publications.
“…A future direction of study will be to develop an efficient algorithm specialized for solving our MIQO problem. We are now working on extending several MIO-based high-performance algorithms [24, 48, 49] to sparse Poisson regression. Another direction of future research is to improve the performance of our methods for selecting tangent lines.…”
Section: Discussion (mentioning)
confidence: 99%
“…Despite its computational hardness, many practically useful algorithms have been proposed to solve ℓ0-regularized ERM, while the statistical properties beyond least squares regression are much less studied. We refer to [4] and [19] for two comprehensive survey articles on ℓ0-regularized regression methods.…”
Section: Related Literature (mentioning)
confidence: 99%
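For reference, the ℓ0-regularized empirical risk minimization (ERM) problem mentioned in this statement can be written as follows; this is a standard formulation, and the notation here is illustrative rather than taken from the cited paper:

```latex
% l0-regularized ERM: average loss over n samples plus a penalty on the
% number of nonzero coefficients (notation illustrative, not from the cited paper).
\min_{\beta \in \mathbb{R}^p} \;
  \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(y_i, x_i^{\top}\beta\bigr)
  \;+\; \lambda \,\lVert \beta \rVert_0,
\qquad
\lVert \beta \rVert_0 = \#\{\, j : \beta_j \neq 0 \,\}.
```

Least squares regression corresponds to \(\ell(y, t) = (y - t)^2\); the combinatorial penalty \(\lVert \beta \rVert_0\) is the source of the computational hardness the statement refers to, and it is what motivates the MIO and cutting-plane approaches discussed in the next statement.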
“…Bertsimas and Van Parys [5] developed a cutting-plane algorithm for the exact sparse regression problem and solved problems with sample sizes and feature counts of 100,000 within 10 minutes. In [4], the authors compared the out-of-sample accuracy and false detection rate for five methods under six noise/correlation scenarios. Based on extensive experiments, they demonstrated that Lasso performs poorly in low noise settings and is comparable with other methods as noise increases, which explains the robustness aspect of Lasso.…”
Section: For a Linear Regression Problem (mentioning)
confidence: 99%
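To make the comparison described above concrete, here is a minimal, hypothetical Python sketch; it does not reproduce the experimental protocol of [4] or [5]. It contrasts Lasso with exhaustive best-subset selection (the brute-force counterpart of the exact sparse regression that cutting-plane/MIO methods solve at scale) on synthetic data and reports out-of-sample error and false detection rate. The problem sizes, regularization value, and noise level are illustrative assumptions.

```python
# Hypothetical sketch (not the setup of [4] or [5]): Lasso vs. brute-force
# best-subset selection on synthetic sparse-regression data.
from itertools import combinations

import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p, k, noise_sd = 200, 20, 5, 0.5       # samples, features, true support size, noise level
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:k] = 1.0                             # true sparse coefficient vector
y = X @ beta + noise_sd * rng.standard_normal(n)
X_test = rng.standard_normal((1000, p))
y_test = X_test @ beta                     # noiseless test responses

true_support = set(range(k))

def false_detection_rate(selected):
    """Fraction of selected features that lie outside the true support."""
    return len(selected - true_support) / max(len(selected), 1)

# Lasso: the convex (l1) relaxation of the l0-constrained problem.
lasso = Lasso(alpha=0.05).fit(X, y)
lasso_support = {j for j in range(p) if abs(lasso.coef_[j]) > 1e-8}
lasso_mse = np.mean((lasso.predict(X_test) - y_test) ** 2)

# Exact sparse regression by brute force: pick the size-k subset with the
# smallest in-sample RSS. Feasible only for tiny p; MIO/cutting-plane methods
# are what make the exact problem tractable at the scales reported in [5].
best_rss, best_cols, best_model = np.inf, None, None
for subset in combinations(range(p), k):
    cols = list(subset)
    ols = LinearRegression().fit(X[:, cols], y)
    rss = np.sum((ols.predict(X[:, cols]) - y) ** 2)
    if rss < best_rss:
        best_rss, best_cols, best_model = rss, cols, ols
subset_mse = np.mean((best_model.predict(X_test[:, best_cols]) - y_test) ** 2)

print(f"Lasso       test MSE {lasso_mse:.3f}  FDR {false_detection_rate(lasso_support):.2f}")
print(f"Best subset test MSE {subset_mse:.3f}  FDR {false_detection_rate(set(best_cols)):.2f}")
```

On a low-noise configuration like this one would typically expect best-subset selection to achieve a lower false detection rate than Lasso, in line with the behavior summarized in the statement above.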