2015
DOI: 10.1109/tip.2015.2456504

Robust Subspace Clustering With Complex Noise

Abstract: Subspace clustering has important and wide applications in computer vision and pattern recognition. Learning low-dimensional subspace structures is challenging because of the complex noise present in high-dimensional data. Such noise has a much more complicated statistical structure and is neither Gaussian nor Laplacian. Recent subspace clustering methods usually assume a sparse representation of the errors incurred by noise and correct these errors iteratively. However, large corruptions incurred by c…

Cited by 45 publications (3 citation statements); references 47 publications.
“…In fact, nonconvex loss functions such as Welsch and Cauchy are widely utilized to achieve robust performance [42]–[44], because the convex ℓ1-norm and the Huber function are still sensitive to outliers of large magnitude. Among these nonconvex functions, the Welsch function as an error measure has seen great success in robust principal component analysis (RPCA) [45], robust matrix completion (RMC) [19], and subspace clustering [46]. Nevertheless, the Welsch function has two limitations: (i) the first issue is stated by comparing the Welsch function with its Huber counterpart.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
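The contrast this snippet draws between Huber and Welsch can be made concrete with a small numerical sketch. The code below is an illustration only, not code from any of the cited works; the scale parameters delta and sigma, their values, and the chosen Welsch parameterization are arbitrary assumptions. It shows that the Huber penalty keeps growing with an outlier's magnitude, while the Welsch penalty saturates at sigma^2/2, so a single gross corruption cannot dominate the objective.

import numpy as np

def huber_loss(r, delta=1.0):
    # Quadratic for |r| <= delta, linear beyond; unbounded as |r| grows.
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r**2, delta * (a - 0.5 * delta))

def welsch_loss(r, sigma=1.0):
    # Bounded above by sigma**2 / 2, so huge residuals add almost nothing
    # beyond that cap (one common parameterization of the Welsch function).
    return (sigma**2 / 2.0) * (1.0 - np.exp(-(r / sigma) ** 2))

residuals = np.array([0.1, 1.0, 5.0, 100.0])
print(huber_loss(residuals))   # ~[0.005, 0.5, 4.5, 99.5]  -- keeps growing with the outlier
print(welsch_loss(residuals))  # ~[0.005, 0.32, 0.5, 0.5]  -- saturates near sigma**2 / 2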
“…They defined an energy function to measure a block in the projected space and selected the block with maximal energy to guide clustering. He et al. [38] proposed the subspace clustering algorithm via half-quadratic minimization (SCHQ) to handle noisy data. SCHQ consists of two parts: the first adopts the ℓ1-norm to obtain a sparse representation of the data, and the second maximizes the correlation between the low-dimensional representation of a given data point and those of the other data points, reducing the damage that noisy data do to the coefficient representation.…”
Section: Introduction (citation type: mentioning; confidence: 99%)
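Since SCHQ, like the cited paper itself, rests on half-quadratic minimization, a minimal sketch of that general machinery may help. The snippet below is an illustration under assumed names and toy data: it applies the standard multiplicative half-quadratic / iteratively reweighted least-squares loop with Welsch weights to a robust linear regression, not the SCHQ algorithm or the paper's clustering model.

import numpy as np

def half_quadratic_regression(X, y, sigma=2.0, n_iter=20):
    # Illustrative half-quadratic loop: alternate between (a) closed-form
    # Welsch weights that shrink toward zero for large residuals and
    # (b) a weighted least-squares solve for the coefficients.
    n, d = X.shape
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS as a warm start
    for _ in range(n_iter):
        r = y - X @ beta                          # current residuals
        w = np.exp(-(r / sigma) ** 2)             # Welsch weights; outliers -> ~0
        Xw = X * w[:, None]                       # row-weighted design matrix
        beta = np.linalg.solve(X.T @ Xw + 1e-8 * np.eye(d), Xw.T @ y)
    return beta

# Toy data with one gross corruption: its weight decays toward zero and the
# fit stays close to the clean coefficients [1, -2].
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = X @ np.array([1.0, -2.0]) + 0.05 * rng.normal(size=50)
y[0] += 50.0
print(half_quadratic_regression(X, y))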
“…Subspace clustering addresses the task of finding the low-dimensional subspaces in which the data from each group fit their own subspace structure. Subspace clustering methods can be classified into algebraic methods, iterative methods, spectral-clustering-based methods, and statistical clustering techniques [9]. The subspace clustering technique is an extension of conventional clustering.…”
Section: Introduction (citation type: mentioning; confidence: 99%)