2013 IEEE International Conference on Computer Vision
DOI: 10.1109/iccv.2013.35

Latent Space Sparse Subspace Clustering

Abstract: We propose a novel algorithm called Latent Space Sparse Subspace Clustering for simultaneous dimensionality reduction and clustering of data lying in a union of subspaces. Specifically, we describe a method that learns the projection of data and finds the sparse coefficients in the low-dimensional latent space. Cluster labels are then assigned by applying spectral clustering to a similarity matrix built from these sparse coefficients. An efficient optimization method is proposed and its non-linear extensions […]
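
The pipeline the abstract describes (learn a projection, compute sparse self-expressive codes in the low-dimensional latent space, then spectrally cluster an affinity built from those codes) can be sketched roughly as below. This is a minimal illustration assuming an alternating scheme with a PCA-style projection update and a lasso-based sparse coding step; the function name, variable names, and update rules are placeholders, not the paper's actual objective or solver.

```python
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.linear_model import Lasso

def ls3c_sketch(X, n_clusters, latent_dim, n_iters=5, alpha=0.01):
    """Illustrative sketch only: learn a projection P and sparse
    self-expressive codes C in the latent space, then spectrally
    cluster the affinity |C| + |C|^T. The alternating updates are
    placeholders, not the paper's actual objective or solver."""
    D, N = X.shape
    # Initialise the projection with leading PCA directions (assumption).
    U, _, _ = np.linalg.svd(X - X.mean(axis=1, keepdims=True), full_matrices=False)
    P = U[:, :latent_dim].T                      # latent_dim x D

    C = np.zeros((N, N))
    for _ in range(n_iters):
        Y = P @ X                                # data in the latent space
        # Sparse self-expression: y_i ~ Y c_i with the constraint c_ii = 0.
        for i in range(N):
            mask = np.arange(N) != i
            lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=2000)
            lasso.fit(Y[:, mask], Y[:, i])
            C[mask, i] = lasso.coef_
        # Placeholder projection update: SVD of the self-expressed data.
        U, _, _ = np.linalg.svd(X @ C, full_matrices=False)
        P = U[:, :latent_dim].T

    W = np.abs(C) + np.abs(C).T                  # affinity from sparse codes
    labels = SpectralClustering(n_clusters=n_clusters,
                                affinity='precomputed').fit_predict(W)
    return labels, P, C
```

The affinity W = |C| + |C|^T follows the standard sparse-subspace-clustering construction; the paper's own projection update would replace the PCA step above.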

Cited by 181 publications (95 citation statements). References 31 publications.

“…We compare SCHQ with state-of-the-art subspace clustering methods, e.g., Local Subspace Analysis (LSA) [44], Spectral curvature clustering (SCC) [45], LRR [9], block-diagonal LRR (BD-LRR) [12], LSR [11], Low-Rank Subspace Clustering (LRSC) [46], SSC [7], Latent Space SSC (LS3C) [19] and Sparse Additive Subspace Clustering (SASC) [47]. We denote the semi-supervised extension of SCHQ to deal with link constraints as S-SCHQ.…”
Section: Methods (mentioning, confidence: 99%)
“…Bayesian methods [14], quadratic programming [15], least squares regression [11], [16], manifold regularization [17] and Markov random walks [18] have also been used to improve clustering accuracy. In addition, Patel et al. [19] proposed a latent space sparse subspace clustering method to simultaneously reduce dimensionality and segment data. Dictionary learning methods [16], [20] were adopted to learn a clean dictionary for subspace clustering.…”
Section: Introduction (mentioning, confidence: 99%)
“…Statistical [12] and algebraic [13], [14] approaches have also been proposed in the literature for subspace clustering. In particular, sparse representation and low-rank approximation-based methods for subspace clustering [15], [16], [11], [17], [18], [19], [20], [21], [22], [23], [24] have gained a lot of traction in recent years. These methods find a sparse or low-rank representation of the data and build a similarity graph whose weights depend on the sparse or low-rank coefficient matrix for segmenting the data.…”
Section: Introduction (mentioning, confidence: 99%)
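The two-step recipe this excerpt describes (compute a sparse or low-rank self-representation matrix, then segment the similarity graph it induces) is sketched below under common, but not paper-specific, choices for the normalisation and the graph Laplacian; segment_from_coefficients is an illustrative helper name, not a function from any of the cited works.

```python
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.cluster import KMeans

def segment_from_coefficients(C, n_clusters):
    """Generic second stage shared by SSC/LRR-style methods:
    turn a self-representation matrix C into a similarity graph
    and segment it with spectral clustering. The normalisation and
    Laplacian choices below are one common option (assumption)."""
    # Scale each column so its largest coefficient has magnitude 1.
    C = C / (np.abs(C).max(axis=0, keepdims=True) + 1e-12)
    W = np.abs(C) + np.abs(C).T          # symmetric affinity graph
    L = laplacian(W, normed=True)        # normalised graph Laplacian
    # The smallest eigenvectors of L span the cluster-indicator subspace.
    _, vecs = np.linalg.eigh(L)
    embedding = vecs[:, :n_clusters]
    embedding /= np.linalg.norm(embedding, axis=1, keepdims=True) + 1e-12
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(embedding)
```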
“…Since the subspace model is often motivated by some underlying physical model, for instance each rigidly moving body induces a 4-dimensional subspace in motion segmentation, over-segmenting the data can be difficult to correct. On the other hand, because most algorithms of this class (Elhamifar and Vidal 2013; Patel, Nguyen, and Vidal 2013; Li and Vidal 2015; Liu, Lin, and Yu 2010) require the number of clusters to be known in advance, cluster centers will be incorrect, which causes the labelling to be wrong. It is therefore more desirable to address the original problem in the first place.…”
Section: Introduction (mentioning, confidence: 99%)
“…An extension of SSC jointly estimates the parameters of a global subspace which includes all the data (Patel, Nguyen, and Vidal 2013). In a recent work, both steps of sparse optimization and spectral clustering were combined into a single, iterative algorithm (Li and Vidal 2015).…”
Section: Introduction (mentioning, confidence: 99%)