2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr.2016.557
Kernel Sparse Subspace Clustering on Symmetric Positive Definite Manifolds

Abstract: Sparse subspace clustering (SSC), one of the most successful subspace clustering methods, has achieved notable clustering accuracy in computer vision tasks. However, SSC applies only to vector data in Euclidean space. As such, there is still no satisfactory approach that solves subspace clustering via the self-expressive principle for symmetric positive definite (SPD) matrices, which are very useful in computer vision. In this paper, by embedding the SPD matrices into a Reproducing Kernel Hilbert Space (RKHS), a kern…
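The abstract's idea — mapping SPD matrices into an RKHS and applying the self-expressive principle there — can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's implementation: the log-Euclidean Gaussian kernel, the ISTA solver, and all parameter values (`gamma`, `lam`, `n_iter`) are assumptions.

```python
import numpy as np

def spd_logm(X):
    # Matrix logarithm of an SPD matrix via eigendecomposition.
    w, V = np.linalg.eigh(X)
    return (V * np.log(w)) @ V.T

def log_euclidean_kernel(mats, gamma=1.0):
    # Log-Euclidean Gaussian kernel between SPD matrices:
    #   k(X, Y) = exp(-gamma * ||log X - log Y||_F^2).
    logs = [spd_logm(X) for X in mats]
    n = len(logs)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            d = logs[i] - logs[j]
            K[i, j] = np.exp(-gamma * np.sum(d * d))
    return K

def kernel_ssc_codes(K, lam=0.1, n_iter=500):
    # ISTA on the kernelized self-expressive objective
    #   min_Z 0.5 * tr(K - 2 K Z + Z^T K Z) + lam * ||Z||_1,  diag(Z) = 0,
    # where K is the Gram matrix of the embedded SPD matrices.
    n = K.shape[0]
    Z = np.zeros((n, n))
    step = 1.0 / (np.linalg.eigvalsh(K).max() + 1e-12)  # 1 / Lipschitz constant
    for _ in range(n_iter):
        G = K @ Z - K                                   # gradient of the smooth term
        Z = Z - step * G
        Z = np.sign(Z) * np.maximum(np.abs(Z) - step * lam, 0.0)  # soft-threshold
        np.fill_diagonal(Z, 0.0)                        # forbid self-representation
    return Z
```

The symmetrized affinity |Z| + |Z|^T would then feed a standard spectral clustering step, as in the usual SSC pipeline.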

Cited by 90 publications (40 citation statements); References 29 publications.
“…Taking i ∈ P\S, j ∈ S for instance, since x_j already has a label, x_i should belong to the same subspace as x_j when |Z_ij| + |Z_ji| is large enough. This can be achieved by the first part of formula (6). In this case, the index of the nonzero elements of the unknown label g_i can be properly correlated to the subspace to which the data belongs.…”
Section: Mathematical Problems in Engineering (confidence: 99%)
“…Nowadays, many works have proposed that the assumption of linear subspaces may not always hold for real high-dimensional data. The data may be better modeled by nonlinear manifolds [6][7][8].…”
Section: Introduction (confidence: 99%)
“…Based on this idea, Vishal M. Patel and R. Vidal proposed kernel sparse subspace clustering in Euclidean space in [30]. Similarly, kernel sparse subspace clustering on symmetric positive definite Riemannian manifolds was proposed by Ming Yin in [31], where the sparse representation of the data is obtained without referring to a dictionary. However, to our knowledge, kernel sparse representation of data on the Grassmann manifold without referring to a dictionary has not been addressed so far.…”
Section: B. Sparse Representation on Grassmann Manifolds (confidence: 99%)
“…To address this problem, kernel methods have recently been widely used in [14,21,22,25,26,30,31,33]. Now we need to find an appropriate kernel for the Grassmann manifold.…”
Section: A Gaussian Projection Kernel for Grassmann Manifolds (confidence: 99%)
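A common kernel choice on the Grassmann manifold is the projection kernel, and a Gaussian variant can be built from the associated projection distance. The sketch below is a plausible reading of the construction named in the excerpt, not the cited authors' exact kernel; the function names and the `beta` parameter are assumptions.

```python
import numpy as np

def grassmann_point(A):
    # Represent the column span of A as an orthonormal basis, i.e. a point
    # on the Grassmann manifold G(p, d) for A of shape (d, p).
    Q, _ = np.linalg.qr(A)
    return Q

def projection_kernel(X, Y):
    # Projection kernel on the Grassmann manifold: k(X, Y) = ||X^T Y||_F^2,
    # where X and Y are orthonormal bases of p-dimensional subspaces of R^d.
    return np.linalg.norm(X.T @ Y, 'fro') ** 2

def gaussian_projection_kernel(X, Y, beta=1.0):
    # A Gaussian kernel built on the projection distance
    #   d^2(X, Y) = p - ||X^T Y||_F^2   (p = subspace dimension),
    # so that identical subspaces give k = 1 and orthogonal ones decay.
    p = X.shape[1]
    return np.exp(-beta * (p - projection_kernel(X, Y)))
```

For identical subspaces the projection kernel evaluates to p, and for mutually orthogonal subspaces it evaluates to 0, which is what makes it a natural similarity measure here.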
“…On the other hand, in the area of matrix information geometry, data representation by symmetric positive definite (SPD) matrices has been widely applied in scientific research, e.g., pattern recognition [12], image processing [13], signal processing [14,15] and machine learning [16]. More specifically, by constructing SPD matrices, the original information extracted from the sample data is embedded in a specific SPD manifold, which has been shown to outperform operating in Euclidean space.…”
Section: Introduction (confidence: 99%)
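A standard way of constructing the SPD representations mentioned in the excerpt is the region covariance descriptor: the covariance matrix of per-region feature vectors, regularized to stay strictly positive definite. This is a generic illustration, not the cited papers' pipeline; the feature layout and the `eps` value are assumptions.

```python
import numpy as np

def region_covariance(features, eps=1e-6):
    # features: (n_samples, d) array of per-pixel feature vectors, e.g.
    # intensity, gradients, and coordinates for an image region.
    # Returns a d x d SPD matrix describing the region.
    C = np.cov(features, rowvar=False)
    return C + eps * np.eye(features.shape[1])  # regularize to guarantee SPD
```

Each region of an image then maps to one point on the SPD manifold, which is the input the manifold-based clustering methods above operate on.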