2011
DOI: 10.1109/tpami.2010.215

Feature Selection and Kernel Learning for Local Learning-Based Clustering

Abstract: The performance of most clustering algorithms relies heavily on the representation of the data in the input space or in the Hilbert space of kernel methods. This paper aims to obtain an appropriate data representation through feature selection or kernel learning within the framework of the Local Learning-Based Clustering (LLC) method (Wu and Schölkopf, 2006), which can outperform global learning-based methods when dealing with high-dimensional data lying on a manifold. Specifically, we associate a weight…
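As a rough illustration of the idea sketched in the abstract, the snippet below shows one way a per-feature weight vector can be folded into the kernel that a local learning-based method operates on, so that learning the weights amounts to learning the data representation. This is a minimal sketch, assuming a Gaussian kernel and a normalized weight vector; the function name `weighted_rbf_kernel` and all parameters are illustrative, not from the paper.

```python
import numpy as np

# Minimal sketch (not the paper's exact formulation): each feature d carries a
# weight w_d, and the Gaussian kernel is computed on the weighted features.
def weighted_rbf_kernel(X, w, gamma=1.0):
    """k(x, x') = exp(-gamma * sum_d w_d * (x_d - x'_d)^2)."""
    Xw = X * np.sqrt(w)                      # scale feature d by sqrt(w_d)
    sq = np.sum(Xw ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (Xw @ Xw.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

X = np.random.rand(50, 10)                   # 50 points, 10 features
w = np.ones(10) / 10                         # uniform initial weights, summing to 1
K = weighted_rbf_kernel(X, w)                # kernel under the current weights
```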

Cited by 210 publications (17 citation statements)
References 30 publications
“…(3) Multi-parameters from multi-network: All parameters extracted from all network layers are used as different input features to the classifier. Then, for each set, various feature selection methods, including Correlation-based Feature Selection (CFS) (Guyon et al, 2002), Dependence Guided Unsupervised Feature Selection (DGUFS) (Zhu et al, 2017), Fisher (Gu et al, 2012), Feature Selective Validation (FSV) (Bradley and Mangasarian, 1999), Locality-Constrained Linear Coding Feature Select (LLCFS) (Zeng and Cheung, 2011), and minimum-redundancy maximum-relevance (mRMR) (Peng et al, 2005), are used to sort the features and obtain a feature sequence for each set. According to the obtained feature sequence, different numbers of features are selected in order (i.e., the first feature, the first two features, the first three features, and so on).…”
Section: Results (confidence: 99%)
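The incremental-subset protocol described in the statement above (rank the features with a selector, then evaluate the first k ranked features for growing k) can be sketched as follows. This is a hedged illustration on synthetic data: the ranking, the classifier, and every name here are placeholders, not taken from the cited work.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Sketch of the incremental protocol: take a feature ranking produced by any
# selector (CFS, mRMR, LLCFS, ...) and score a classifier on the first
# 1, 2, 3, ... ranked features. Data and ranking are synthetic placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))               # 100 samples, 20 features
y = rng.integers(0, 2, size=100)             # binary labels
ranking = np.arange(20)                      # placeholder for a selector's output

scores = []
for k in range(1, len(ranking) + 1):
    subset = ranking[:k]                     # the first k features, in ranked order
    acc = cross_val_score(KNeighborsClassifier(), X[:, subset], y, cv=5).mean()
    scores.append(acc)

best_k = int(np.argmax(scores)) + 1          # subset size with the best CV accuracy
```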
“…Eight popular state-of-the-art feature selection methods were analyzed as representatives of different FS approaches, including: Chi-square test (Chi2), Infinite latent feature selection (ILFS) [32], Relief-F (ReliefF) [33], Laplacian score (LS) [34], Local learning-based clustering with feature selection (LLCFS) [35], Minimum redundancy-maximum relevance (mRMR) [36], Random forests (RF) [37,38], and Gradient boosting machine (GBM) [39]. Supplementary Table S3 describes their categories and computing complexity.…”
Section: Methods (confidence: 99%)
“…LLCFS [35] integrates local structure learning and feature selection into a unified framework. Specifically, LLCFS embeds weighted features into the regularization term of the local learning-based clustering algorithm and selects features according to their weights;…”
Section: Methods (confidence: 99%)
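To make the description above concrete, here is a loose, simplified sketch of the alternating idea (cluster on weighted features, then re-weight the features and keep the top-weighted ones). It is not the paper's exact LLCFS update: the k-means step, the variance-based feature score, and every name in it are assumptions made for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans

# Loose illustration only, not the LLCFS algorithm itself: alternate between
# (1) clustering on feature-weighted data and (2) re-weighting each feature by
# how strongly its cluster means differ, then keep the top-weighted features.
def weighted_feature_selection(X, n_clusters=3, n_iter=5, n_keep=5):
    n, d = X.shape
    w = np.ones(d) / d                        # start from uniform weights
    for _ in range(n_iter):
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X * np.sqrt(w))
        means = np.array([X[labels == c].mean(axis=0) for c in range(n_clusters)])
        score = means.var(axis=0) + 1e-12     # crude per-feature relevance score
        w = score / score.sum()               # renormalize so the weights sum to 1
    return np.argsort(w)[::-1][:n_keep], w    # indices of the top-weighted features

X = np.random.rand(60, 12)
selected, w = weighted_feature_selection(X)
```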
“…URAFS [9] embeds the local geometric structure of the data into the manifold learning framework by introducing a graph regularization term, based on the principle of maximum entropy, into the GURM model, so that irrelevant features of the original data are filtered out;
• UDFS [25] embeds discriminative analysis and the ℓ2,1-norm into the feature selection framework to select discriminative and informative features;
• SPEC [19] is a unified feature selection framework based on graph theory, used to select relevant features by combining supervised and unsupervised feature selection;
• NDFS [28] utilizes the discriminative information and correlation of features to select feature subsets. Specifically, the method combines cluster labels learned by the spectral clustering algorithm with the feature selection matrix to select the most discriminative features;
• LLCFS [35] integrates local structure learning and feature selection into a unified framework. Specifically, LLCFS embeds weighted features into the regularization term of the local learning-based clustering algorithm and selects features according to their weights;
• JELSR [27] is based on an unsupervised learning structure and combines embedding learning with sparse regression to select features.…”
Section: Contrast Algorithm (confidence: 99%)
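Several of the methods quoted above, UDFS in particular, rely on the ℓ2,1-norm as a row-sparsity regularizer. A minimal sketch of that norm follows; the matrix shape and the helper name `l21_norm` are assumptions for illustration.

```python
import numpy as np

# The l2,1-norm referenced for UDFS above: for a projection matrix W with one
# row per feature, ||W||_{2,1} is the sum of the l2 norms of the rows.
# Penalizing it pushes entire rows to zero, i.e., discards whole features.
def l21_norm(W):
    return float(np.sum(np.linalg.norm(W, axis=1)))

W = np.random.randn(8, 3)                    # toy matrix: 8 features, 3 components
print(l21_norm(W))
```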