2018
DOI: 10.1007/s11749-018-0584-4

Prediction error bounds for linear regression with the TREX

Abstract: The TREX is a recently introduced approach to sparse linear regression. In contrast to most well-known approaches to penalized regression, the TREX can be formulated without the use of tuning parameters. In this paper, we establish the first known prediction error bounds for the TREX. Additionally, we introduce extensions of the TREX to a more general class of penalties, and we provide a bound on the prediction error in this generalized setting. These results deepen the understanding of TREX from a theoretical…
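For readers unfamiliar with the estimator, the abstract's tuning-parameter-free claim refers to the standard TREX objective from the literature: a squared-error loss divided by c times the sup-norm of the correlation between design and residual, plus an ℓ1 penalty, with c fixed at a universal value (typically 1/2) rather than chosen from the data. The NumPy/SciPy sketch below only illustrates that objective on toy data with a generic local optimizer; the function and variable names are ours, and it is not the estimation algorithm analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize

def trex_objective(beta, X, y, c=0.5):
    # TREX objective: ||y - X beta||_2^2 / (c * ||X^T (y - X beta)||_inf) + ||beta||_1.
    # The constant c (default 0.5) is a fixed universal choice, not a data-driven tuning parameter.
    r = y - X @ beta
    denom = c * np.max(np.abs(X.T @ r))
    return r @ r / denom + np.sum(np.abs(beta))

# Toy data: sparse ground truth, Gaussian design and noise.
rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Naive local search on the nonconvex, nonsmooth objective -- for illustration
# only; this is not the global-optimization algorithm cited below.
res = minimize(trex_objective, x0=np.zeros(p), args=(X, y),
               method="Nelder-Mead", options={"maxiter": 20000})
print(np.round(res.x, 2))
```

Because the objective is nonconvex and nonsmooth, a generic local optimizer carries no guarantees; the dedicated TREX algorithms discussed in the citation statements below are the appropriate tools in practice.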

Cited by 19 publications (10 citation statements); References 46 publications.
“…This bound parallels the one for the trex for ℓ1-regularized linear regression (Bien et al 2018b, Theorem 2). Moreover, it relates to Theorem 3.1; in particular, if λ ≤ λ*, then the t-ridge bound equals the edr bound at the optimal tuning parameter λ*, without the t-ridge knowing that tuning parameter.…”
Section: Further Theoretical Insights (supporting)
confidence: 70%
“…Bien et al [2016] showed the remarkable result that despite the nonconvexity, there exists a polynomial-time algorithm that is guaranteed to find the global minimum of the TREX problem. Bien et al [2018] recently established a prediction error bound for TREX, which deepens the understanding of the theoretical properties of TREX.…”
Section: TREX (mentioning)
confidence: 99%
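As context for the polynomial-time claim in this excerpt, the decomposition behind such guarantees is usually written as follows; this is a sketch in the standard notation (response y, design X with columns x_j, fixed constant c), not a verbatim statement of the cited result.

\[
\widehat{\beta}_{\mathrm{TREX}} \in \operatorname*{arg\,min}_{\beta \in \mathbb{R}^p}
\left\{ \frac{\lVert y - X\beta \rVert_2^2}{c\,\lVert X^\top (y - X\beta) \rVert_\infty} + \lVert \beta \rVert_1 \right\},
\]
\[
\min_{\beta:\; s\,x_j^\top (y - X\beta) \,>\, 0}
\left\{ \frac{\lVert y - X\beta \rVert_2^2}{c\, s\, x_j^\top (y - X\beta)} + \lVert \beta \rVert_1 \right\},
\qquad j \in \{1,\dots,p\},\ s \in \{-1,+1\}.
\]

Each of the \(2p\) restricted problems has a quadratic-over-linear (hence convex) first term plus an \(\ell_1\) penalty, so each is convex, and a global TREX minimizer can be read off as the best of the \(2p\) subproblem solutions.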
“…In contrast, we derive both fast-rate and slow-rate bounds. The main advantage of the slow-rate bounds is that they require neither a sparsity assumption nor the restricted eigenvalue condition; we refer to [2, 9] for a discussion of the two types of bounds. As part of the slow-rate bound derivation, we demonstrate that the norm of B can always be bounded by a constant times the norm of B*.…”
Section: Relations To Existing Literature (mentioning)
confidence: 99%
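For the distinction drawn in this excerpt, the two kinds of guarantees typically take the following shapes in sparse linear regression; these are the generic forms from the lasso literature, written for a coefficient vector β*, and the exact statements and constants in the cited works differ.

\[
\text{slow rate:}\qquad \frac{1}{n}\,\lVert X(\widehat{\beta} - \beta^*) \rVert_2^2 \;\lesssim\; \lVert \beta^* \rVert_1 \sqrt{\frac{\log p}{n}}
\quad\text{(no sparsity or restricted-eigenvalue assumptions)},
\]
\[
\text{fast rate:}\qquad \frac{1}{n}\,\lVert X(\widehat{\beta} - \beta^*) \rVert_2^2 \;\lesssim\; \frac{s \log p}{\phi^2\, n},
\qquad s = \lvert \operatorname{supp}(\beta^*) \rvert,\ \phi \text{ a restricted-eigenvalue constant}.
\]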
“…Proof of Lemma 1. Let f = f(Θ, B) denote the objective function in (2) and let (Θ*, B*) be the global solution to (2), that is…”
Section: A1 Technical Proofs (mentioning)
confidence: 99%