ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2019.8683263

Robust Unsupervised Flexible Auto-weighted Local-coordinate Concept Factorization for Image Clustering

Abstract: We investigate the high-dimensional data clustering problem by proposing a novel unsupervised representation learning model called Robust Flexible Auto-weighted Local-coordinate Concept Factorization (RFA-LCF). RFA-LCF integrates robust flexible CF, robust sparse local-coordinate coding and adaptive reconstruction weighting learning into a unified model. The adaptive weighting is driven by including the joint manifold preserving constraints on the recovered clean data, basis concepts and new repres…

Cited by 15 publications (15 citation statements). References: 26 publications.
“…By alternately updating the adaptive weights, the robust projection and the factorization matrices, RFA-LCF can ensure that the learnt weights are optimal for the data representations. Note that an early version of this work was presented in [57]. This paper further provides a detailed analysis of the formulation, its convergence, its computational complexity and its relationship to related models, and moreover conducts a thorough experimental evaluation on the tasks of data representation and clustering.…”
Section: (3) Auto-weighted Reconstruction Graph Learning
confidence: 99%
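The alternating scheme this statement describes can be sketched, under strong simplifications, as a loop that updates nonnegative factors and per-sample weights in turn. The toy objective, update rules and function name below are illustrative assumptions, not the actual RFA-LCF model:

```python
import numpy as np

def alternating_weighted_factorization(X, k, n_iter=50, eps=1e-9, seed=0):
    """Toy alternating scheme (NOT the full RFA-LCF model):
    min_{U,V>=0} sum_i d_i * ||x_i - U v_i||^2, where the per-sample
    weights d_i are re-derived from the residuals each round.
    Assumes X >= 0 so multiplicative updates stay nonnegative."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k))
    V = rng.random((n, k))
    d = np.ones(n)
    for _ in range(n_iter):
        # weighted multiplicative update for U (columns weighted by d)
        U *= ((X * d) @ V) / (U @ ((V * d[:, None]).T @ V) + eps)
        # d_i factors out of each column subproblem, so V's update is unweighted
        V *= (X.T @ U) / (V @ (U.T @ U) + eps)
        # auto-weighting step: down-weight samples with large residuals
        resid = np.linalg.norm(X - U @ V.T, axis=0)
        d = 1.0 / (2.0 * resid + eps)
    return U, V, d
```

The reweighting rule d_i = 1/(2||x_i - Uv_i||) is the standard trick for turning a sum of unsquared residual norms into an iteratively reweighted least-squares loop; the full model additionally learns a robust projection, which is omitted here.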
“…Although the localities can be clearly retained by these graph-based CF methods, they still have one glaring flaw: it is difficult to choose an optimal number of nearest neighbors to define the neighborhood graph. To overcome this issue, researchers have recently proposed optimized adaptive-graph based CF methods, such as CF with Adaptive Neighbors (CFANs) [12], CF with Optimal Graph Learning (CF-OGL) [126], Graph-Regularized Local-coordinate CF with CLR (GRLCFCLR) [9] and Robust Flexible Auto-weighted Local-Coordinate CF (RFA-LCF) [14][15], etc. These optimized weighting strategies effectively avoid the tricky issue of choosing the optimal number of nearest neighbors when constructing the neighborhood graph.…”
Section: Introduction
confidence: 99%
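The shared idea behind these adaptive-graph methods can be illustrated with the closed-form simplex weights used in adaptive-neighbor graph learning (in the spirit of CFANs): each sample's neighbor weights solve a small quadratic program whose solution is automatically k-sparse. The function below is a hedged sketch, not code from any of the cited papers:

```python
import numpy as np

def adaptive_neighbor_weights(X, k=5):
    """For each row x_i, assign closed-form weights on the probability
    simplex so that exactly (at most) k neighbors get nonzero weight:
    s_ij proportional to (d_{i,k+1} - d_ij) over the k nearest j."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    S = np.zeros((n, n))
    for i in range(n):
        d = D[i].copy()
        d[i] = np.inf                      # exclude self-loops
        idx = np.argsort(d)[: k + 1]       # k nearest plus the (k+1)-th
        dk1 = d[idx[k]]                    # distance to the (k+1)-th neighbor
        S[i, idx[:k]] = (dk1 - d[idx[:k]]) / (k * dk1 - d[idx[:k]].sum() + 1e-12)
    return S
```

By construction each row of S sums to one and is nonnegative, so the graph weights come out of the optimization itself rather than from a hand-tuned kernel width.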
“…LGCF1 [10] and RFA-LCF [14][15]. Self-representation based CF algorithms are mainly motivated by the success of exploiting the self-representation of data, in which the input data itself is regarded as a dictionary; this has provided a new idea for deriving self-expressive CF.…”
Section: Introduction
confidence: 99%
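The self-expressive idea can be made concrete with a minimal ridge-regularized formulation (an illustrative choice, not the objective of the cited methods): each column of X is reconstructed as a linear combination of all columns, with X itself playing the role of the dictionary.

```python
import numpy as np

def self_representation(X, lam=0.1):
    """Ridge-regularized self-expressive coding:
    min_C ||X - X C||_F^2 + lam * ||C||_F^2,
    closed form C = (X^T X + lam I)^{-1} X^T X."""
    n = X.shape[1]          # samples are columns, C is n x n
    G = X.T @ X
    return np.linalg.solve(G + lam * np.eye(n), G)
```

Practical self-expressive models typically add sparsity or diagonal constraints on C so that a sample is not trivially explained by itself; the plain ridge form above just shows the "data as dictionary" mechanism.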
See 1 more Smart Citation
“…Representation learning from high-dimensional complex data has always been an important and fundamental problem in the fields of pattern recognition and data mining [40][41][42][43][44][45][46][47][48][49][50]. To represent data, many feasible and effective approaches can be used, of which Matrix Factorization (MF) based models have proven effective for low-dimensional feature extraction and clustering [24][25][26][27][28][29][30][31][32][36][37][38][39]. Nonnegative Matrix Factorization (NMF) [1] and Concept Factorization (CF) [2] are the two most classical nonnegative MF methods.…”
Section: Introduction
confidence: 99%
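Classical CF factorizes the data as X ≈ X W Vᵀ, so that each basis "concept" is a nonnegative combination of the data points themselves. A minimal sketch with the standard multiplicative updates, written in terms of the Gram matrix K = XᵀX and assuming X ≥ 0 (hyperparameters and names below are illustrative):

```python
import numpy as np

def concept_factorization(X, k, n_iter=200, eps=1e-9, seed=0):
    """Classical Concept Factorization: X ~= X W V^T with W, V >= 0.
    The updates only touch K = X^T X, which is nonnegative when X is."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = rng.random((n, k))   # concepts as combinations of samples
    V = rng.random((n, k))   # per-sample codes over the k concepts
    K = X.T @ X
    for _ in range(n_iter):
        W *= (K @ V) / (K @ W @ (V.T @ V) + eps)
        V *= (K @ W) / (V @ (W.T @ K @ W) + eps)
    return W, V
```

Because everything runs through K, the same updates kernelize directly, which is one reason CF is preferred over NMF when the data are not naturally nonnegative in the input space.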