2005
DOI: 10.1198/004017005000000139
Simultaneous Variable Selection

Abstract: We propose a new method for selecting a common subset of explanatory variables where the aim is to model several response variables. The idea is a natural extension of the LASSO technique proposed by Tibshirani (1996) and is based on the (joint) residual sum of squares while constraining the parameter estimates to lie within a suitable polyhedral region. The properties of the resulting convex programming problem are analyzed for the special case of an orthonormal design. For the general case, we develop an ef…
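The idea in the abstract, a grouped penalty that zeroes out a variable's coefficients for all responses at once, can be sketched with scikit-learn's MultiTaskLasso. Note the caveat: MultiTaskLasso uses an l2/l1 group penalty rather than the polyhedral (max-norm) constraint of this paper, but both induce row-wise sparsity in the coefficient matrix and hence select a common variable subset. The data below is synthetic and for illustration only.

```python
# Illustrative sketch (not the paper's exact estimator): simultaneous
# variable selection across several responses via a grouped penalty.
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n, p, q = 100, 10, 3                  # samples, predictors, responses
X = rng.standard_normal((n, p))
B = np.zeros((p, q))
B[:3] = 1.0                           # only the first 3 predictors matter
Y = X @ B + 0.1 * rng.standard_normal((n, q))

model = MultiTaskLasso(alpha=0.1).fit(X, Y)
# model.coef_ has shape (q, p); a column of zeros means that variable is
# dropped for every response simultaneously (shared sparsity pattern).
selected = np.flatnonzero(np.any(model.coef_ != 0, axis=0))
print(selected)
```

With a grouped penalty the zero pattern is forced to be identical across all q responses, which is exactly the "common subset" behavior the abstract describes.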

Cited by 302 publications (236 citation statements)
References 22 publications
“…In some signal processing applications, however, the number of nonzero x components may be significant, and since these methods require at least as many pivot operations as there are nonzeros in the solution, they may be less competitive on such problems. The interior-point (IP) approach in [58], which solves a generalization of (4), also requires explicit construction of A^T A, though the approach could in principle be modified to allow iterative solution of the linear system at each primal-dual iteration.…”
Section: B. Previous Algorithms
confidence: 99%
“…For example, the methodology that was developed for multivariate linear regression in Turlach et al (2005) could be extended to multinomial logit models, even though the technical details would be cumbersome. Overall, each penalty term that yields direct variable selection in multinomial logit models must penalize the parameter vectors β •j in a groupwise fashion.…”
Section: Regularization: The Categorically Structured Lasso
confidence: 99%
“…as a regularization term analogous to the Group-Lasso (Yuan and Lin, 2006; Bach, 2008) and Multitask-Lasso (Turlach et al., 2005; Liu et al., 2009) with p ∈ [1, ∞]. Varoquaux et al. (2010) considered the case p = 2, while Honorio and Samaras (2010) used p = ∞.…”
Section: Learning a Set of GGMs with Same Topological Patterns
confidence: 99%
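The mixed l1/lp regularization term mentioned in the excerpt above is simple to write down: an lp norm over each variable's row of coefficients, summed (an l1 norm) over variables. A minimal sketch, where the function name and toy matrix are illustrative rather than taken from any cited paper:

```python
# Mixed l1/lp penalty on a p x q coefficient matrix B, where row j
# collects variable j's coefficients across all tasks/responses.
import numpy as np

def mixed_norm_penalty(B: np.ndarray, p: float = 2.0) -> float:
    """Omega(B) = sum_j ||B[j, :]||_p.

    p = 2 recovers the Group-Lasso penalty (Yuan and Lin, 2006);
    p = inf recovers the max-norm penalty of Turlach et al. (2005).
    The outer l1 sum over rows drives entire rows to zero, i.e.
    variable-wise selection shared across tasks.
    """
    return float(np.sum(np.linalg.norm(B, ord=p, axis=1)))

B = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, -1.0]])
print(mixed_norm_penalty(B, 2.0))      # row norms: 5.0, 0.0, sqrt(2)
print(mixed_norm_penalty(B, np.inf))   # row norms: 4.0, 0.0, 1.0
```

The choice of p only changes how coefficients within a row are aggregated; any p in [1, ∞] yields the shared-sparsity effect, which is why the cited works treat it as a single family.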
“…Another way is to impose general and mild assumptions on the data. This kind of approach is especially common in the multi-task learning literature (Caruana, 1997; Turlach et al., 2005), where the relationships among datasets are treated as a clue for combining multiple tasks into a single problem. The scope of the present paper is in the latter context, where the relationship among datasets is the objective we want to analyze.…”
Section: Introduction
confidence: 99%