2016
DOI: 10.1080/01621459.2015.1005839

Spline-Lasso in High-Dimensional Linear Regression

Abstract: We consider a high-dimensional linear regression problem, where the covariates (features) are ordered in some meaningful way, and the number of covariates p can be much larger than the sample size n. The fused lasso of Tibshirani et al. (2005) is designed especially to tackle this type of problem; it yields sparse coefficients, selects grouped variables, and encourages a locally constant coefficient profile within each group. However, in some applications, the effects of different features within a group might…


Cited by 30 publications (31 citation statements)
References 22 publications
“…Guo et al. (2016) show that the spline-type penalty can outperform these alternatives. Another advantage of the spline-type penalty is that the quadratic form is computationally more manageable than, for example, the absolute-value-based penalties.…”
Section: Methods (mentioning)
confidence: 99%
“…With main G effects, some alternatives, such as the fused lasso and smooth lasso, promote first-order smoothness, while this penalty promotes second-order smoothness. Guo et al. (2016) show that the spline-type penalty can outperform these alternatives. Another advantage of the spline-type penalty is that the quadratic form is computationally more manageable than, for example, the absolute-value-based penalties.…”
Section: Methods (mentioning)
confidence: 99%
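The computational remark in the quotation (that a quadratic, spline-type penalty is easier to handle than absolute-value penalties such as the fused lasso) reflects the fact that such a penalty can be written as a quadratic form $\beta^{\top}D^{\top}D\beta$ for a second-difference matrix $D$, so the penalized least-squares problem reduces to a linear system. The sketch below only illustrates that algebra under simplifying assumptions: the lasso ($\ell_1$) part is replaced by a small ridge term, and the function names and matrix construction are illustrative, not code from Guo et al.

```python
import numpy as np

def second_difference_matrix(p):
    """(p-2) x p matrix D whose rows are (..., 1, -2, 1, ...), so that
    (D @ beta)[m] = beta[m] - 2 * beta[m + 1] + beta[m + 2]."""
    D = np.zeros((p - 2, p))
    for m in range(p - 2):
        D[m, m:m + 3] = [1.0, -2.0, 1.0]
    return D

def quadratic_spline_fit(X, y, lam_ridge, lam_spline):
    """Illustrative fit of  min ||y - X b||^2 + lam_spline * ||D b||^2,
    plus a small ridge term standing in for the l1 component so the
    system is well posed when p > n.  Because every term is quadratic,
    the minimizer comes from a single linear solve, which is the
    'computationally manageable' point in the quotation."""
    n, p = X.shape
    D = second_difference_matrix(p)
    A = X.T @ X + lam_spline * (D.T @ D) + lam_ridge * np.eye(p)
    return np.linalg.solve(A, X.T @ y)

# Toy usage: smoothly varying true coefficients, more features than samples.
rng = np.random.default_rng(0)
n, p = 50, 100
beta_true = np.sin(np.linspace(0.0, np.pi, p))
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat = quadratic_spline_fit(X, y, lam_ridge=1.0, lam_spline=50.0)
print(np.round(beta_hat[:5], 3))
```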
“…Hebiri and van de Geer introduced the “smooth-lasso,” where the smoothness between $(\beta_{j,1},\ldots,\beta_{j,q_j})$ was encouraged by the $L_2$-penalty $\sum_{m=2}^{q_j}(\beta_{j,m+1}-\beta_{j,m})^2$. Guo et al. suggested $\sum_{m=2}^{q_j-1}\bigl(\Delta_m^{(2)}\beta_j\bigr)^2$, where $\Delta_m\beta_j := \beta_{j,m+1}-\beta_{j,m}$ and $\Delta_m^{(2)}\beta_j := \Delta_m\beta_j-\Delta_{m-1}\beta_j = \beta_{j,m+1}-2\beta_{j,m}+\beta_{j,m-1}$. Motivated by the ideas of these two methods for capturing the smooth features in a group, we propose two types of penalties to capture smooth changes in each group, named the group spline-penalty and group smooth-penalty, respectively.…”
Section: Methods (mentioning)
confidence: 99%
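To make the two quoted penalties concrete, the snippet below evaluates, for one group's coefficient vector, the first-order (smooth-lasso style) term and the second-order (spline-type) term from the formulas above. It is a sketch only: the function names are illustrative, and the sums run over all available differences rather than the exact index ranges in the quotation.

```python
import numpy as np

def smooth_penalty(beta_j):
    """First-order smoothness: sum of squared adjacent differences,
    sum_m (beta_{j,m+1} - beta_{j,m})^2  (smooth-lasso style)."""
    d1 = np.diff(beta_j)          # Delta_m beta_j
    return float(np.sum(d1 ** 2))

def spline_penalty(beta_j):
    """Second-order smoothness: sum of squared second differences,
    sum_m (beta_{j,m+1} - 2*beta_{j,m} + beta_{j,m-1})^2  (spline-type)."""
    d2 = np.diff(beta_j, n=2)     # Delta_m^(2) beta_j
    return float(np.sum(d2 ** 2))

# A linear trend within the group: penalized at first order,
# but invisible to the second-order (spline-type) penalty.
beta_j = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
print(smooth_penalty(beta_j))   # 1.0
print(spline_penalty(beta_j))   # 0.0
```

The zero second-order value for a linear trend is the practical difference between the two penalties: the first-order terms favor locally constant coefficient profiles, while the spline-type term tolerates smoothly varying ones.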
“…Hebiri and van de Geer [11] introduced “smooth-lasso,” where the smoothness between $(\beta_{j,1},\ldots,\beta_{j,q_j})$ was encouraged by the $L_2$-penalty $\sum_{m=2}^{q_j}(\beta_{j,m+1}-\beta_{j,m})^2$. Guo et al. [13] suggested…”
Section: The Model Setting and Methodology (mentioning)
confidence: 99%