2016
DOI: 10.1609/aaai.v30i1.10317
Uncorrelated Group LASSO

Abstract: The l2,1-norm is an effective regularization for enforcing simple group sparsity in feature learning. To capture subtler structures among feature groups, we propose a new regularization called the exclusive group l2,1-norm. It enforces sparsity at the intra-group level by using the l2,1-norm, while encouraging the selected features to distribute across different groups by using the l2 norm at the inter-group level. The proposed exclusive group l2,1-norm is capable of eliminating the feature correlations in the context of fea…
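The abstract describes the regularizer only in words, so a minimal numerical sketch may help. It assumes the penalty takes the exclusive-lasso-style form Omega(W) = sum_g (sum_{i in g} ||W[i,:]||_2)^2, that is, an l2,1-norm inside each feature group combined by a squared l2 norm across groups (this exact composition is an assumption; the truncated abstract does not pin it down):

```python
import numpy as np

def exclusive_group_l21(W, groups):
    """Sketch of an exclusive group l2,1-norm penalty.

    Assumed form (not confirmed by the truncated abstract):
        Omega(W) = sum_g ( sum_{i in g} ||W[i, :]||_2 )^2
    i.e. an l2,1-norm within each feature group, combined by a
    squared l2 norm across groups.

    W      : (d, k) weight matrix, one row per feature.
    groups : list of integer index arrays partitioning the d rows.
    """
    row_norms = np.linalg.norm(W, axis=1)               # per-feature l2 norms
    return sum(row_norms[g].sum() ** 2 for g in groups)

# Toy usage: six features split into two groups of three.
rng = np.random.default_rng(0)
W = rng.standard_normal((6, 3))
groups = [np.arange(0, 3), np.arange(3, 6)]
print(exclusive_group_l21(W, groups))
```

Because each group's contribution is squared, spreading row norms across groups lowers the penalty relative to concentrating them in one group, which matches the inter-group behavior the abstract states.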

Cited by 10 publications (1 citation statement). References 25 publications.
“…Since the binarization operation for the low-dimensional features RP_j^T c_j and binary codes b_j increases the information loss, we consider using feature sparsity on the personalized weight P_j to select important features. Exclusive group lasso (EGL) [18,19] encourages intra-cluster competition but discourages inter-cluster competition. Inspired by EGL, we first impose an l2,1-norm regularization on the personalized weight P_j to pursue sparsity of the intra-cluster features.…”
Section: Personalized Sparse Hashing
confidence: 99%
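For reference, the l2,1-norm the quoted statement imposes on the personalized weight P_j has the standard matrix definition below; a LaTeX sketch assuming rows of P_j index features i and columns index dimensions k (the citing paper's exact row/column convention is an assumption here):

```latex
% l2,1-norm of the personalized weight P_j: sum of the l2 norms of its rows.
\[
  \|P_j\|_{2,1}
  = \sum_{i} \bigl\| (P_j)_{i,:} \bigr\|_{2}
  = \sum_{i} \sqrt{\sum_{k} (P_j)_{ik}^{2}}
\]
```

Summing the l2 norms of the rows acts like an l1 norm over rows, driving whole feature rows to zero, which is the intra-cluster sparsity the statement pursues.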