2016
DOI: 10.1109/tnnls.2015.2464090

Learning Robust and Discriminative Subspace With Low-Rank Constraints

Abstract: In this paper, we aim at learning robust and discriminative subspaces from noisy data. Subspace learning is widely used in extracting discriminative features for classification. However, when data are contaminated with severe noise, the performance of most existing subspace learning methods would be limited. Recent advances in low-rank modeling provide effective solutions for removing noise or outliers contained in sample sets, which motivates us to take advantage of low-rank constraints in order to e…

Cited by 127 publications (61 citation statements)

References 47 publications
“…Problem (1) is hard to optimize due to the discrete nature of the rank function [37]. A common practice is to replace the rank function with the nuclear norm [36], [38], so that Eq. (1) can be substituted with the following convex optimization [39], [40]:…”
Section: B. Low-Rank Multi-View Subspace Learning Based Approaches
mentioning
confidence: 99%
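The equation itself is truncated in the snippet. As a hedged sketch, assuming Eq. (1) is the standard low-rank representation (LRR) objective common in this literature, the rank-to-nuclear-norm relaxation typically reads:

```latex
% Hedged sketch: assumes Eq. (1) is the standard LRR objective;
% the snippet truncates before the actual formula.
% Original non-convex rank minimization:
%   \min_{Z,E} \; \mathrm{rank}(Z) + \lambda \|E\|_{2,1}
%   \quad \text{s.t.} \quad X = XZ + E
% Convex surrogate: replace rank(Z) with the nuclear norm:
\min_{Z,E} \; \|Z\|_{*} + \lambda \|E\|_{2,1}
\quad \text{s.t.} \quad X = XZ + E
```

Here \|Z\|_{*} denotes the sum of singular values of Z, which is the tightest convex envelope of the rank function on the unit spectral-norm ball; this is what makes the relaxed problem convex and tractable.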
“…discrete targets, and then exploit the projection matrix to perform image classification or regression [3], [4]. In addition, discriminative methods can achieve impressive performance when a robust projection matrix is constructed and sufficient training samples are provided [5], [6].…”
mentioning
confidence: 99%
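A minimal sketch of the regression-to-discrete-targets idea described above: learn a projection matrix by ridge regression of features onto one-hot class targets, then classify by the largest response. The data, dimensions, and ridge parameter here are illustrative assumptions, not the cited papers' exact formulation.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the cited papers' method):
# learn a projection matrix W by ridge regression onto one-hot
# class targets, then classify by the largest projected response.

def learn_projection(X, y, n_classes, lam=1e-2):
    """X: (n_samples, n_features); y: integer labels in [0, n_classes)."""
    Y = np.eye(n_classes)[y]                      # one-hot targets, shape (n, c)
    d = X.shape[1]
    # Closed-form ridge solution: W = (X^T X + lam*I)^{-1} X^T Y
    W = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
    return W                                      # (d, c) projection matrix

def predict(X, W):
    return np.argmax(X @ W, axis=1)               # class with largest response

# Toy usage on random data (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
y = rng.integers(0, 3, size=100)
W = learn_projection(X, y, n_classes=3)
print(predict(X[:5], W))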
“…Latent low-rank representation (LatLRR) [26] explores the unobserved hidden information of data, and can robustly extract salient features from noisy or corrupted data. Subsequently, many variations of low-rank minimization have been applied to solve different problems [3], [27]-[29]. For example, Li and Fu [3] proposed a supervised regularization-based robust subspace learning method that jointly removes the noise term with a low-rank constraint and learns a discriminative subspace from the clean data.…”
mentioning
confidence: 99%
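The workhorse step shared by nuclear-norm solvers of this family is singular value thresholding (SVT), the proximal operator of the nuclear norm. The sketch below and its threshold parameter are illustrative; it is not the specific ALM/ADMM algorithm of [3] or [26].

```python
import numpy as np

# Singular value thresholding (SVT): the proximal operator of the
# nuclear norm, the core step in most low-rank subspace solvers.
# Illustrative sketch; not the exact algorithm of [3] or [26].

def svt(M, tau):
    """Solve argmin_Z 0.5*||Z - M||_F^2 + tau*||Z||_* in closed form."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_thr = np.maximum(s - tau, 0.0)       # soft-threshold the singular values
    return (U * s_thr) @ Vt                # reconstruct the low-rank estimate

# Toy usage: denoise a rank-2 matrix corrupted by Gaussian noise
rng = np.random.default_rng(0)
L = rng.normal(size=(50, 2)) @ rng.normal(size=(2, 40))   # rank-2 ground truth
X = L + 0.1 * rng.normal(size=L.shape)                    # noisy observation
L_hat = svt(X, tau=1.0)
print(np.linalg.matrix_rank(L_hat, tol=1e-6))             # far below full rank
```

Soft-thresholding shrinks small singular values to zero, which is what discards the noise component while retaining the dominant subspace structure.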
“…To evaluate the scalability of our method and competing methods, we separately utilize samples of 20, 40, 60, 80, and 100 objects in this dataset. Several subspace methods (NPE [55], LSDA [56], and SRRS [60]) and the latest DL-based classification methods, i.e., [59] and LCLRD [29], are compared on this dataset.…”
Section: Classification
mentioning
confidence: 99%