2012
DOI: 10.1007/978-3-642-33786-4_26
Robust and Efficient Subspace Segmentation via Least Squares Regression

Abstract: This paper studies the subspace segmentation problem, which aims to segment data drawn from a union of multiple linear subspaces. Recent works using sparse representation, low-rank representation, and their extensions have attracted much attention. If the subspaces from which the data are drawn are independent or orthogonal, these methods can obtain a block-diagonal affinity matrix, which usually leads to a correct segmentation. The main differences among them are their objective functions. We theoretically sho…
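The least squares regression (LSR) criterion in the title admits a closed-form self-representation, min_Z ||X − XZ||_F² + λ||Z||_F², whose absolute coefficients can serve as an affinity for spectral clustering. A minimal NumPy sketch; the function name, λ value, and data sizes are illustrative, not from the paper:

```python
import numpy as np

def lsr_representation(X, lam=0.01):
    """Closed-form solution of min_Z ||X - X Z||_F^2 + lam ||Z||_F^2.

    X is a d x n data matrix whose columns are samples; lam > 0 keeps the
    linear system well conditioned. Returns the n x n coefficient matrix Z
    satisfying (X^T X + lam I) Z = X^T X.
    """
    n = X.shape[1]
    G = X.T @ X
    return np.linalg.solve(G + lam * np.eye(n), G)

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 20))        # toy data: 20 samples in R^5
Z = lsr_representation(X)
W = np.abs(Z) + np.abs(Z).T             # symmetric affinity for spectral clustering
```

For data drawn from independent subspaces, Z is (up to noise) block diagonal, which is the property the abstract refers to.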

Cited by 521 publications (498 citation statements) | References 27 publications
“…We conduct face clustering experiments on the Extended Yale B and AR datasets and motion segmentation experiments on the Hopkins155 dataset. WSPQ is compared to several related subspace clustering methods, including principal component analysis (PCA), local subspace affinity (LSA), 15 spectral curvature clustering (SCC), 16 least squares regression (LSR), 46 low rank subspace clustering (LRSC), 47 Schatten-p norm regularized LRR (SPM), 40 SSC, 21 and LRR. 28 All experiments are conducted in MATLAB 2012a on a laptop with an Intel Core i7-4710MQ CPU and 8 GB of RAM.…”
Section: Methods
confidence: 99%
“…The SLRR was inspired by collaborative representation and low-rank representation techniques, which are used in classification and subspace clustering [2,38,39]. This proposed technique identifies clusters using the angular information of the principal directions of the symmetric low-rank representation, which preserves the low-rank subspace structures.…”
Section: Symmetric Low-Rank Representation
confidence: 99%
“…In the absence of noise, i.e., when the samples are strictly drawn from multiple subspaces, several criteria are imposed on the optimization models to learn the representation of the samples as an affinity matrix for spectral clustering, solving the subspace clustering problem exactly [2,4,39,40]. For example, SSC employs the sparsest representation using an ℓ1-norm regularization, while LRR seeks the lowest-rank representation using a nuclear-norm regularization.…”
Section: The Symmetric Low-Rank Representation Model
confidence: 99%
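Unlike LSR's ridge-style problem, SSC's ℓ1-regularized self-representation has no closed form and is usually solved iteratively. A minimal ISTA sketch for coding one column against the remaining columns; the step size, λ, and iteration count are illustrative choices, not taken from the cited papers:

```python
import numpy as np

def soft_threshold(v, t):
    # elementwise shrinkage: proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ssc_column(X, j, lam=0.1, n_iter=500):
    """ISTA for min_c ||x_j - A c||_2^2 + lam ||c||_1, with A = X minus column j."""
    A = np.delete(X, j, axis=1)
    b = X[:, j]
    L = np.linalg.norm(A, 2) ** 2          # squared spectral norm of A
    c = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ c - b)           # half the gradient of the smooth term
        c = soft_threshold(c - grad / L, lam / (2 * L))
    return c

rng = np.random.default_rng(0)
X = rng.standard_normal((10, 15))
c0 = ssc_column(X, 0)                      # sparse code for the first sample
```

Stacking these per-column codes (with a zero on the diagonal) gives the sparse coefficient matrix whose symmetrized absolute values form the SSC affinity.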
“…In [38], Zhang pointed out that the success of SRC comes from the collaborative representation of y over all training samples. The ℓ2-norm is supposed to take advantage of data correlation [39]. Thus, in CRC the query image y is represented over the over-complete dictionary with an ℓ2-norm rather than an ℓ1-norm constraint on the coding vector.…”
Section: Related Work
confidence: 99%
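The ℓ2-constrained coding described above, like LSR, has a closed-form ridge solution, and classification proceeds by class-wise reconstruction residuals. A minimal sketch in the spirit of CRC; the toy dictionary, λ, and residual scoring shown here are illustrative assumptions, not the exact CRC-RLS formulation:

```python
import numpy as np

def crc_code(D, y, lam=1e-3):
    """Ridge coding: min_a ||y - D a||_2^2 + lam ||a||_2^2, closed form."""
    m = D.shape[1]
    return np.linalg.solve(D.T @ D + lam * np.eye(m), D.T @ y)

def crc_classify(D, labels, y, lam=1e-3):
    """Assign y to the class whose atoms best reconstruct it,
    scaling each class residual by the norm of that class's coefficients."""
    a = crc_code(D, y, lam)
    best, best_score = None, np.inf
    for c in np.unique(labels):
        idx = labels == c
        residual = np.linalg.norm(y - D[:, idx] @ a[idx])
        score = residual / (np.linalg.norm(a[idx]) + 1e-12)
        if score < best_score:
            best, best_score = c, score
    return best

# toy dictionary: class 0 atoms near u, class 1 atoms near v
rng = np.random.default_rng(1)
u = np.array([1.0, 0.0, 0.0, 0.0])
v = np.array([0.0, 1.0, 0.0, 0.0])
atoms = [u + 0.01 * rng.standard_normal(4) for _ in range(3)]
atoms += [v + 0.01 * rng.standard_normal(4) for _ in range(3)]
D = np.column_stack(atoms)
labels = np.array([0, 0, 0, 1, 1, 1])
pred = crc_classify(D, labels, u)          # query lying on class 0's direction
```

Because the code is a single linear solve, this is the efficiency argument made for ℓ2-based collaborative representation over ℓ1-based sparse coding.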