The 2006 IEEE International Joint Conference on Neural Network Proceedings
DOI: 10.1109/ijcnn.2006.246933

Common Subset Selection of Inputs in Multiresponse Regression

Abstract: We propose the Multiresponse Sparse Regression algorithm, an input selection method for the purpose of estimating several response variables. It is a forward selection procedure for linearly parameterized models, which updates with carefully chosen step lengths. The step length rule extends the correlation criterion of the Least Angle Regression algorithm to many responses. We present a general concept and explicit formulas for three different variants of the algorithm. Based on experiments with simu…
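The abstract describes a forward-selection loop whose input-selection rule extends the LARS correlation criterion to several responses at once. As a rough sketch of that idea only (not the authors' exact procedure: the paper's carefully chosen step lengths are replaced here by an ordinary least-squares refit on the active set, and the name mrsr_forward_selection is a placeholder), the following Python snippet greedily adds the input whose correlations with the current multiresponse residual have the largest L2 norm:

```python
import numpy as np

def mrsr_forward_selection(X, T, n_select):
    """Greedy sketch of multiresponse input selection in the spirit of MRSR.

    X : (n, p) input matrix, T : (n, q) matrix of response variables.
    Returns the indices of the selected inputs in the order they entered.

    NOTE: simplified illustration. The actual MRSR algorithm advances the
    fit with carefully chosen step lengths along a LARS-like path; here the
    fit is simply refit by ordinary least squares on the active set.
    """
    active = []
    Y_hat = np.zeros_like(T, dtype=float)
    for _ in range(n_select):
        resid = T - Y_hat
        # Multiresponse correlation criterion: L2 norm, taken over the
        # responses, of each unused input's correlation with the residual.
        scores = np.linalg.norm(resid.T @ X, axis=0)
        scores[active] = -np.inf          # exclude inputs already selected
        j = int(np.argmax(scores))
        active.append(j)
        # Simplified update: least-squares fit on the active inputs.
        W, *_ = np.linalg.lstsq(X[:, active], T, rcond=None)
        Y_hat = X[:, active] @ W
    return active
```

Called as, for example, `mrsr_forward_selection(X, T, n_select=5)`, it returns the inputs in the order they would enter the model under this simplified rule.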

Cited by 14 publications (9 citation statements). References 18 publications. Citing publications span 2007–2018.

Citation statements (ordered by relevance):
“…As R is multivariate in our setup, the most obvious application of the LARS algorithm is to feed the columns of R into the algorithm one at a time. An alternative approach is ‘multiresponse sparse regression’ (MRSR), proposed by Similä and Tikka () as an extension of LARS that allows for a multivariate response variable.…”
Section: Methods
Mentioning confidence: 99%
“…Denote the fitted value of R, based on the first m regressors chosen, by $\hat{R}_m$ (with $\hat{R}_0 = 0$), and denote the j-th column of X by $x^{(j)}$. Following Similä and Tikka (), the role of correlations in the description of the univariate case above is now played by $\|(R - \hat{R}_m)' x^{(j)}\|$, where $\|v\|$ represents the $L_2$ vector norm $\left(\sum_i v_i^2\right)^{1/2}$. No other changes to the procedure are needed.…”
Section: Methods
Mentioning confidence: 99%
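The quoted selection rule can be evaluated directly. The short sketch below is illustrative only: R, R_hat_m and X stand for the response matrix, the current fit and the input matrix, and the helper name next_regressor is hypothetical. It ranks each unused column $x^{(j)}$ by the L2 norm of its correlations with the current residual, i.e. the quantity $\|(R - \hat{R}_m)' x^{(j)}\|$ described above:

```python
import numpy as np

def next_regressor(R, R_hat_m, X, active=()):
    # Score each column x^(j) by ||(R - R_hat_m)' x^(j)||_2, the
    # multiresponse analogue of LARS's absolute-correlation rule.
    scores = np.linalg.norm((R - R_hat_m).T @ X, axis=0)
    scores[list(active)] = -np.inf   # ignore regressors already in the model
    return int(np.argmax(scores))
```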
“…Finally, the LARS/lasso approach can also be extended for multivariate (multiresponse) regression. Similä and Tikka () propose an extension of the LARS algorithm by modifying the correlation criterion between the predictors and the current residual (which depends on multiple outputs). Unfortunately, the exact regularization path can be recovered only when X is orthonormal.…”
Section: Adapting the Lasso to Particular Problems
Mentioning confidence: 99%