2014
DOI: 10.1016/j.neunet.2013.11.009
Feature selection and multi-kernel learning for sparse representation on a manifold

Cited by 55 publications (22 citation statements)
References 19 publications
“…To some extent we alleviate this computational issue by using a joint feature weights learning strategy [25], [26]. Let $X = [x_1, \ldots, x_n] \in \mathbb{R}^{d \times n}$ be the training data matrix in the original feature space.…”
Section: Basic Idea and Formulation
confidence: 99%
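The excerpt above fixes the notation: the $n$ training samples are stacked as columns of a $d \times n$ matrix $X$, and a single weight per feature is then learned jointly across all samples. A minimal sketch of that setup follows; it assumes nothing about the cited learning strategy [25], [26] beyond a shared nonnegative weight vector, and the weights here are random placeholders.

```python
import numpy as np

# Sketch only: the weight vector below is a random placeholder, not the
# joint feature weights learning strategy of [25], [26].
rng = np.random.default_rng(0)
d, n = 100, 50                   # feature dimension, number of samples
X = rng.standard_normal((d, n))  # X = [x_1, ..., x_n] in R^{d x n}

w = rng.uniform(size=d)          # one nonnegative weight per feature
X_weighted = w[:, None] * X      # diag(w) @ X: the same weights are
                                 # applied jointly to every sample
```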
“…(4), the optimal size of the selected feature subset, i.e., $d'$, cannot be automatically determined. Moreover, this typical integer programming problem is difficult to solve due to the 0-1 constraint on the indicator vector $w$. We reformulate problem (4) by relaxing this binary constraint to lie within the range $[0, 1]$ (Liu et al., 2011; Wang, Bensmail, & Gao, 2014), and obtain the final formulation of the Locality Preserving Score (LPS) in the following form:…”
Section: Ideas and Algorithm
confidence: 99%
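A worked reading of the relaxation: once the binary indicator $w \in \{0,1\}^d$ is replaced by $w \in [0,1]^d$, a locality-preserving objective becomes continuous and can be minimized coordinate-wise. The sketch below is an assumption-laden illustration, not the exact LPS formulation of the excerpt: it scores each feature by $c_k = \sum_{ij} S_{ij}(x_{ik} - x_{jk})^2$ from an assumed similarity graph $S$ and minimizes $\sum_k (c_k w_k^2 - \lambda w_k)$ over $[0,1]^d$, which admits the closed-form solution $w_k = \operatorname{clip}(\lambda / 2c_k,\, 0,\, 1)$.

```python
import numpy as np

def relaxed_lps_weights(X, S, lam=1.0):
    """Hypothetical relaxed locality-preserving weights (not the paper's
    exact LPS). X is d x n; S is an n x n symmetric similarity graph."""
    L = np.diag(S.sum(axis=1)) - S                 # graph Laplacian
    # Per-feature locality cost c_k = 2 * (X L X^T)_kk, i.e. how much
    # feature k separates samples that the graph says are neighbors.
    c = 2.0 * np.einsum('dn,nm,dm->d', X, L, X)
    # Coordinate-wise minimizer of sum_k (c_k w_k^2 - lam w_k) on [0,1]^d.
    return np.clip(lam / (2.0 * np.maximum(c, 1e-12)), 0.0, 1.0)
```

Under these assumptions, features that keep graph neighbors close (small $c_k$) receive weight 1, features that scatter neighbors are shrunk toward 0, and $\lambda$ stands in for the subset-size parameter $d'$ that the binary formulation could not determine automatically.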
“…Feature selection is an important issue in classification [30]. These methods can be divided into two main approaches: wrappers and filters [14].…”
Section: Feature Selection
confidence: 99%