2014 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn.2014.6889472

Tensor LRR based subspace clustering

Abstract: Subspace clustering groups a set of samples (vectors) into clusters by approximating the set with a mixture of several linear subspaces, so that samples in the same cluster are drawn from the same linear subspace. In the majority of existing works on subspace clustering, samples are simply regarded as independent and identically distributed; that is, samples may be arbitrarily reordered when convenient. However, this setting ignores sample correlations in their original spatial structure. To address thi…
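The subspace-clustering setting described in the abstract can be illustrated with a minimal numpy sketch. This is not the paper's TLRR algorithm: it uses the classical shape-interaction matrix Q = V_r V_r^T, which is block-diagonal for noiseless data drawn from independent subspaces, so connected components of its affinity graph recover the clusters. All variable names here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 10
# two independent 1-D subspaces in R^5, n samples each,
# with coefficients bounded away from zero
u, w = rng.normal(size=d), rng.normal(size=d)
X = np.hstack([np.outer(u, rng.uniform(1, 2, n)),
               np.outer(w, rng.uniform(1, 2, n))])

# shape-interaction matrix Q = V_r V_r^T is block-diagonal when the
# subspaces are independent and the data are noiseless
r = np.linalg.matrix_rank(X)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Q = Vt[:r].T @ Vt[:r]
A = np.abs(Q) > 1e-8          # affinity: nonzero => same subspace

# clusters = connected components of the affinity graph
labels = -np.ones(2 * n, dtype=int)
c = 0
for i in range(2 * n):
    if labels[i] < 0:
        stack = [i]
        labels[i] = c
        while stack:
            j = stack.pop()
            for k in np.nonzero(A[j])[0]:
                if labels[k] < 0:
                    labels[k] = c
                    stack.append(k)
        c += 1
```

With clean data the two blocks separate exactly; with noise or non-independent subspaces, robust formulations such as LRR (and the tensor extension this paper proposes) are needed instead of a hard threshold.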

Cited by 11 publications (17 citation statements). References 26 publications.
“…The measurements can be highly quantized due to the sensing process and the objective is to recover the data [49,50]. A similar idea also applies to low-quality image recovery [15,2]. Images from the same subject can be represented by {rows of an image × columns of an image × different images}.…”
Section: Notation and Preliminaries
confidence: 99%
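The "{rows of an image × columns of an image × different images}" representation in the quote above is a third-order tensor, and tensor methods operate on its mode-n unfoldings. A minimal sketch of such an unfolding (the `unfold` helper is an illustrative implementation, not code from the paper):

```python
import numpy as np

def unfold(T, n):
    """Mode-n unfolding: mode n indexes the rows and the remaining
    modes are flattened into the columns."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

# a toy "rows x columns x images" tensor: three 4x5 grayscale images
T = np.arange(4 * 5 * 3).reshape(4, 5, 3)
M0 = unfold(T, 0)   # shape (4, 15): one row per image row
M2 = unfold(T, 2)   # shape (3, 20): one row per vectorized image
```

Vectorizing each image (mode-2 unfolding here) discards the row/column structure; keeping the tensor intact is what lets tensor low-rank models exploit the spatial correlations the abstract refers to.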
“…Practical datasets may contain additional correlations that cannot be captured by low-rank matrices. For instance, if every frame of a video is vectorized so that the video is represented by a matrix, the spatial correlation is not directly characterized by low-rank matrices [15]. In recommendation systems, users' ratings against objects vary under different contexts [16]. We use the notations g = O(n) and g = Θ(n) if, as n goes to infinity, g ≤ c · n and c1 · n ≤ g ≤ c2 · n eventually hold for some positive constants c, c1, and c2, respectively.…”
Section: Introduction
confidence: 99%
“…That is, TLRR seeks optimal low-rank solutions U_n (1 ≤ n < N) of the structured data X using itself as a dictionary [50]. Remark 1: There is a clear link between the LRR and the TLRR in (8).…”
Section: Subspace Clustering via Tensor Low-Rank Representation
confidence: 99%
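The "data as its own dictionary" idea in the quote above has a well-known matrix-case anchor: for noiseless data, the LRR problem min ||Z||_* s.t. X = XZ has the closed-form solution Z* = V_r V_r^T built from the top-r right singular vectors of X (Liu et al.). A minimal sketch of that self-expressive property, with illustrative data:

```python
import numpy as np

rng = np.random.default_rng(1)
# 20 samples in R^6 drawn from a single 2-D subspace
X = rng.normal(size=(6, 2)) @ rng.normal(size=(2, 20))

r = np.linalg.matrix_rank(X)            # = 2 for clean data
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Z = Vt[:r].T @ Vt[:r]                   # closed-form noiseless LRR solution

# self-expressiveness: the data reconstruct themselves, X = X Z
ok = np.allclose(X @ Z, X)
```

TLRR generalizes this by seeking such low-rank representations along each spatial mode of the tensor rather than a single representation of vectorized samples.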
“…Thus, our model (11) aims to find the lowest-rank representations along all the spatial modes, and at the same time learn a dictionary with a sparse representation over the samples on the feature mode. Unlike the TLRR model in [50], which considers only the multiple spatial modes, our model takes both spatial and feature information into consideration. The advantage of our model over the TLRR model is shown in Fig.…”
Section: Tensor Spatial Low-Rank Representation and Feature Sparse
confidence: 99%