2008 47th IEEE Conference on Decision and Control
DOI: 10.1109/cdc.2008.4738836

The use of nonnegative garrote for order selection of ARX models

Abstract: Order selection of linear regression models has been thoroughly researched in the statistical community for some time. Different shrinkage methods have been proposed, such as the Ridge and Lasso regression methods. The Lasso regression in particular has won fame because of its ability to set less important parameters exactly to zero. However, these methods do not take dynamical systems into account, where the regressors are ordered via the time lag. To this end, a modified variant of the nonnegative garrote…

Cited by 9 publications (14 citation statements). References 10 publications.
“…In dynamical linear models, the parameters are naturally ordered by their memory length [11]. In the nonlinear case, the parameters are not only ordered by memory length, but also by their nonlinearity order.…”
Section: B. Modification on Original NNG Method
confidence: 99%
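The lag ordering the quote refers to can be made concrete: in an ARX regression, each column of the regressor matrix corresponds to a fixed time lag of the output or input, so the parameters inherit a natural order by memory length. A minimal sketch (the function name and the predictor-form sign convention are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def arx_regressor_matrix(y, u, na, nb):
    """Build the ARX regression matrix Phi and target vector Y.

    Columns are ordered by time lag (memory length): first the na past
    outputs y[t-1], ..., y[t-na], then the nb past inputs
    u[t-1], ..., u[t-nb].  Predictor form: y[t] ~ Phi[t] @ theta.
    """
    n0 = max(na, nb)
    rows = []
    for t in range(n0, len(y)):
        past_y = [y[t - k] for k in range(1, na + 1)]
        past_u = [u[t - k] for k in range(1, nb + 1)]
        rows.append(past_y + past_u)
    return np.array(rows), np.array(y[n0:])
```

Least squares on this matrix gives the OLS estimate that shrinkage methods such as the NNG then reweight lag by lag.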
“…At first it was considered a coefficient shrinkage method for linear regression models in statistics. Recently this method was modified and used for order selection of ARX (AutoRegressive with eXogenous input) models in [11]. The NNG method penalizes the model parameters by attaching weights to them, which in turn are regularized.…”
Section: Modified Nonnegative Garrote Method, A. The Nonnegative Garrote
confidence: 99%
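The weighting scheme described in the quote can be sketched as follows. This is a plain coordinate-descent solver for the standard garrote objective min_{c ≥ 0} ||y − X diag(c) θ_OLS||² + λ Σ c_i — a sketch assuming that formulation, not the modified ARX variant of the paper or its path algorithm; the function name is illustrative:

```python
import numpy as np

def nng_weights(X, y, lam, n_iter=200):
    """Nonnegative garrote: shrink the OLS estimate theta_ols by
    nonnegative weights c via simple coordinate descent."""
    theta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    Z = X * theta_ols                 # column j is x_j * theta_ols[j]
    c = np.ones(X.shape[1])
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            r = y - Z @ c + Z[:, j] * c[j]   # residual excluding term j
            zj2 = Z[:, j] @ Z[:, j]
            if zj2 > 0:
                # stationarity of the penalized LS objective in c[j],
                # clipped at zero (this is what sets weights exactly to 0)
                c[j] = max(0.0, (Z[:, j] @ r - lam / 2.0) / zj2)
    return c, c * theta_ols
```

Weights driven to zero remove the corresponding regressors, which is how the garrote performs selection rather than mere shrinkage.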
“…The fit itself can be calculated in terms of any error measure or the BIC or AIC criterion. An efficient way to implement this strategy is to use a path following parametric estimation, which calculates a piecewise affine solution path for λ [9]. The NNG is reported to be more effective in recovering the sparsity structure of θ o than the LASSO.…”
Section: B. Sparse Estimators: LASSO and NNG
confidence: 99%
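As a simpler alternative to the piecewise-affine solution path mentioned above, selection by BIC can be sketched as an exhaustive search over candidate model orders (an AR example for brevity; the function names are illustrative, and this grid search is not the path-following method of [9]):

```python
import numpy as np

def bic(y, y_hat, k):
    """BIC for a least-squares fit with k parameters (Gaussian errors)."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    return n * np.log(rss / n) + k * np.log(n)

def select_order_by_bic(y, max_order):
    """Fit AR models of order 1..max_order by least squares and
    return the order minimizing BIC."""
    best = None
    for p in range(1, max_order + 1):
        # columns y[t-1], ..., y[t-p] for rows t = p .. N-1
        Phi = np.column_stack(
            [y[p - k - 1 : len(y) - k - 1] for k in range(p)]
        )
        Y = y[p:]
        theta = np.linalg.lstsq(Phi, Y, rcond=None)[0]
        score = bic(Y, Phi @ theta, p)
        if best is None or score < best[0]:
            best = (score, p)
    return best[1]
```

The path-following approach cited in the quote achieves the same end more efficiently by exploiting that the solution is piecewise affine in λ, so the whole trade-off curve is computed at once.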
“…More recently, statistical regularization (shrinkage) methods have been developed, such as the Non-Negative Garrote (NNG) and the Least Absolute Shrinkage and Selection Operator (LASSO) [5]-[7], or the Ridge Regression and Elastic Net methods [8]. The NNG was applied in the context of identification of Linear Time-Invariant (LTI) Auto Regressive with eXogenous input (ARX) models in [9].…”
Section: Introduction
confidence: 99%