2013
DOI: 10.1007/978-3-642-37456-2_37
Low-Rank Matrix Recovery with Discriminant Regularization

Abstract: Recently, image classification has been an active research topic due to the urgent need to retrieve and browse digital images via semantic keywords. Building on the success of low-rank matrix recovery, which has been applied in statistical learning, computer vision, and signal processing, this paper presents a novel low-rank matrix recovery algorithm with discriminant regularization. The standard low-rank matrix recovery algorithm decomposes the original dataset into a set of representative bases with a corre…

Cited by 11 publications (8 citation statements)
References 27 publications
“…This method differs from our approach in two aspects. First, RPCA used in [43] can only model one single subspace, whereas our approach is able to discover multiple subspaces by virtue of LRR, which fits well for multiclass classification problems. Second, the method in [43] separately learns low-rank data representation and subspace, which means the obtained subspace cannot be guaranteed to be optimal, whereas our approach iteratively learns LRRs and discriminative subspaces.…”
Section: B. Low-Rank Modeling
confidence: 98%
“…In [43], a discriminant regularization term is incorporated into the formulation of RPCA. This method differs from our approach in two aspects.…”
Section: B. Low-Rank Modeling
confidence: 99%
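The statement above concerns RPCA augmented with a discriminant regularization term. Plain RPCA (principal component pursuit) decomposes a matrix M into a low-rank part L plus a sparse part S; the following is a minimal sketch of one common solver, a fixed-penalty alternating-direction scheme, not necessarily the exact algorithm used in [43]:

```python
import numpy as np

def rpca_pcp(M, lam=None, mu=None, tol=1e-7, max_iter=500):
    """Principal Component Pursuit: decompose M ~= L (low-rank) + S (sparse)
    by alternating singular-value thresholding on L and soft-thresholding on S."""
    m, n = M.shape
    if lam is None:
        lam = 1.0 / np.sqrt(max(m, n))          # standard PCP weight
    if mu is None:
        mu = m * n / (4.0 * np.abs(M).sum())     # common penalty heuristic
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                         # dual variable
    for _ in range(max_iter):
        # Low-rank update: shrink singular values by 1/mu.
        U, s, Vt = np.linalg.svd(M - S + Y / mu, full_matrices=False)
        L = U @ np.diag(np.maximum(s - 1.0 / mu, 0.0)) @ Vt
        # Sparse update: entrywise soft-thresholding by lam/mu.
        R = M - L + Y / mu
        S = np.sign(R) * np.maximum(np.abs(R) - lam / mu, 0.0)
        # Dual ascent on the constraint M = L + S.
        Y = Y + mu * (M - L - S)
        if np.linalg.norm(M - L - S) <= tol * np.linalg.norm(M):
            break
    return L, S
```

Note that this sketch models a single low-dimensional subspace, which is exactly the limitation the citing authors contrast against LRR's multiple-subspace representation.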
“…Based on the results of sparse manifold adaption in Section 4.2, we set (N_i, γ) as a near-optimal combination (20, 1) for all data sets. As reported in Zhuang et al. (2012) and Zheng, Zhang, Jia et al. (2013), LRR variants are insensitive to the variation of λ provided that it is given a relatively large value (usually 10). Thus it is reasonable to set λ as a fixed value to alleviate the burden of parameter tuning, which means that the level of corruption in data could be fixed.…”
Section: Comparing with LRR Variants
confidence: 83%
“…GLRR model employs the accelerated gradient method (Ji & Ye, 2009) to update J, which is the auxiliary variable w.r.t. Z; while in our experiments, we relax the GLRR objective function as described in Zheng, Zhang, Jia et al. (2013) to solve J by using the SVT operator.…”
Section: Comparing with LRR Variants
confidence: 99%
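The SVT (singular value thresholding) operator referenced in this statement soft-thresholds the singular values of a matrix; it is the proximal operator of the nuclear norm and the core step in most low-rank solvers. A minimal NumPy sketch (illustrative, not the citing authors' implementation):

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: shrink each singular value of X by tau,
    clipping at zero. Returns the nearest matrix under the nuclear-norm prox."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)
    return U @ np.diag(s_shrunk) @ Vt
```

Because singular values below tau are set exactly to zero, the output has reduced rank, which is why SVT appears as the update for the low-rank auxiliary variable J above.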