2015
DOI: 10.1016/j.patcog.2014.10.021

Sparse discriminative feature selection

Cited by 50 publications (11 citation statements)
References 21 publications
“…In practical IC problems, normalization steps such as mean normalization [31] or norm normalization [23] are always conducted in advance. These steps can regulate the distribution of the input data and enhance the numerical stability of classifiers, but their consequences have not been analyzed theoretically or empirically.…”
Section: The Norm Normalization Problem (mentioning)
confidence: 99%
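For concreteness, a minimal NumPy sketch of the two preprocessing steps named above (the exact procedures in [23] and [31] may differ):

```python
import numpy as np

def mean_normalize(X):
    # Center each feature (column) by subtracting its mean over samples.
    return X - X.mean(axis=0)

def norm_normalize(X):
    # Rescale each sample (row) to unit L2 norm; guard zero-norm rows.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    return X / np.maximum(norms, 1e-12)

X = np.random.rand(100, 20)     # 100 samples, 20 features
X_centered = mean_normalize(X)
X_unit = norm_normalize(X)
```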
“…On the other hand, sparse feature selection methods have received increasing attention [29]. By formulating feature selection as a regression model with an ordinary least squares (OLS) term and a specifically designed sparsity-inducing regularizer, the response can be efficiently represented by a linear combination of a small set of the most active variables.…”
Section: Introduction (mentioning)
confidence: 99%
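A common concrete instance of such a model (not necessarily the exact formulation of [29]) is ℓ2,1-regularized least squares over a data matrix X ∈ R^{n×d} and target matrix Y ∈ R^{n×c}:

```latex
\min_{W \in \mathbb{R}^{d \times c}} \ \lVert XW - Y \rVert_F^2 + \lambda \lVert W \rVert_{2,1},
\qquad
\lVert W \rVert_{2,1} = \sum_{i=1}^{d} \Big( \sum_{j=1}^{c} W_{ij}^2 \Big)^{1/2}.
```

The row-wise penalty drives entire rows of W to zero, so the features indexing the surviving rows form the selected subset.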
“…Although these methods can rank the features by their weights, they may be less effective and efficient because the importance of each feature is evaluated individually and global optimality is neglected. Therefore, sparsity regularization has been introduced into feature selection and has received increasing interest in recent years. Unlike traditional feature-searching strategies, methods based on sparsity regularization select the best feature subset in a batch mode, so that the interactions and dependencies between different features are considered at the same time.…”
Section: Introduction (mentioning)
confidence: 99%
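A minimal sketch of such batch-mode selection, here via ℓ1-regularized regression with scikit-learn's Lasso (one common realization, not necessarily the method of the works cited above):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))        # 200 samples, 50 features
w_true = np.zeros(50)
w_true[:5] = 1.0                          # only 5 informative features
y = X @ w_true + 0.1 * rng.standard_normal(200)

# All weights are fit jointly, so correlated or redundant features
# compete with each other instead of being scored in isolation.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)    # indices of nonzero weights
print("selected features:", selected)
```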