2017
DOI: 10.1109/tip.2017.2716180
Robust Face Recognition With Kernelized Locality-Sensitive Group Sparsity Representation

Abstract: In this paper, a novel joint sparse representation method is proposed for robust face recognition. We embed both group sparsity and kernelized locality-sensitive constraints into the framework of sparse representation. The group sparsity constraint is designed to exploit the grouped structure information in the training data. The local similarity between test and training data is measured in the kernel space instead of the Euclidean space. As a result, the embedded nonlinear information can be effecti…
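The abstract's central idea, measuring test-to-training similarity in a kernel feature space rather than Euclidean space, can be sketched as follows. This is an illustrative sketch only, not the paper's exact formulation: the RBF kernel choice, the `gamma` parameter, and the `kernel_locality_weights` helper are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """RBF kernel value between two vectors (assumed kernel choice)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_locality_weights(test, train, gamma=1.0):
    """Distance from the test sample to each training sample, measured in
    the kernel feature space:
        ||phi(x) - phi(y)||^2 = k(x, x) - 2 k(x, y) + k(y, y).
    Similar (nearby) training samples get small weights, so a
    locality-weighted penalty favors them in the sparse coding."""
    kxx = rbf_kernel(test, test, gamma)  # equals 1 for the RBF kernel
    d2 = np.array([kxx - 2.0 * rbf_kernel(test, t, gamma) + rbf_kernel(t, t, gamma)
                   for t in train])
    return np.sqrt(np.maximum(d2, 0.0))

# Toy usage: the test sample is a slightly perturbed copy of training sample 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))            # 5 training samples, 4 features
x = X[0] + 0.01 * rng.normal(size=4)   # test point near sample 0
w = kernel_locality_weights(x, X, gamma=0.5)
# np.argmin(w) -> 0: the nearest training sample receives the smallest weight
```

The kernel-space distance reduces to a combination of kernel evaluations, which is why the nonlinear similarity never requires an explicit feature map.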


Cited by 35 publications (21 citation statements)
References 38 publications
“…However, the authors did not provide information about the occlusion or the maximum pose-variation angle investigated in their paper. In [16], the authors describe a Kernelized Locality-Sensitive Group Sparsity Representation (KLS-GSRC) method for face recognition. Extensive experiments were conducted on standard databases such as LFW, ORL, Extended Yale B, and the AR dataset.…”
Section: Related Work
confidence: 99%
“…). The ADMM iterates satisfy this property. Since the closedness and properness of our cost function (13) are clear, we further need to ensure that all the subproblems are convex and have convergent solutions. It is evident that subproblem (17) is strictly convex and has the closed-form solution (18); the remaining issue is to prove the convexity and convergence of subproblem (19).…”
Section: Proposition 2 ([38])
confidence: 99%
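The quoted passage argues that every ADMM subproblem must be convex with a convergent, ideally closed-form, update. Subproblems (17)–(19) belong to the cited paper; as a generic illustration only, the sketch below applies the same pattern to a standard lasso problem, where the x-update is a closed-form ridge solve and the z-update is soft-thresholding. All names and parameters here are assumptions for the example, not the cited paper's notation.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (element-wise shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=200):
    """Generic ADMM for min_x 0.5||Ax - b||^2 + lam||z||_1  s.t. x = z.
    Both subproblems are convex: the x-update is a ridge solve with a
    closed-form (Cholesky) solution, and the z-update is soft-thresholding,
    so the iterates converge for this convex, closed, proper objective."""
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    AtA, Atb = A.T @ A, A.T @ b
    L = np.linalg.cholesky(AtA + rho * np.eye(n))  # factor once, reuse
    for _ in range(iters):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # x-update (ridge)
        z = soft_threshold(x + u, lam / rho)               # z-update (prox)
        u = u + x - z                                      # scaled dual ascent
    return z

# Toy usage: recover a 2-sparse signal from noiseless measurements.
rng = np.random.default_rng(1)
A = rng.normal(size=(30, 10))
x0 = np.zeros(10); x0[2], x0[7] = 1.0, -1.0
z = admm_lasso(A, A @ x0, lam=0.01, rho=1.0, iters=500)
# z recovers x0 up to a small lasso shrinkage bias
```

Factoring the ridge system once outside the loop is the standard trick that makes each x-update cheap, mirroring why closed-form subproblem solutions matter in the quoted analysis.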
“…Both the theoretical analysis and the experimental results have shown the promising performance of GSC, outperforming both SRC and CRC. More recently, it has been verified that the property of locality preservation is more important for a classifier [13,14]. As a result, many regression-based works have been proposed for improvement, such as integrating data locality into the constraints of the l1-norm [15,16], the l2-norm [17,18], or the group norm [19,20].…”
Section: Introduction
confidence: 98%
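The group norm mentioned above (e.g. the l2,1-norm used in group sparse coding) has a simple proximal operator that shrinks each class's coefficient block as a unit, which is what lets whole training classes compete in GSC-style methods. A minimal sketch, assuming contiguous block-wise groups and an illustrative `group_soft_threshold` helper:

```python
import numpy as np

def group_soft_threshold(c, groups, t):
    """Proximal operator of t * sum_g ||c_g||_2 (the group / l2,1 norm).
    Each group's coefficient block is shrunk toward zero as a unit, so
    whole groups (e.g. all coefficients of one training class) are
    kept or discarded together."""
    out = np.zeros_like(c)
    for g in groups:
        nrm = np.linalg.norm(c[g])
        if nrm > t:
            out[g] = (1.0 - t / nrm) * c[g]
    return out

# Toy usage: two classes with three coefficients each.
c = np.array([3.0, 4.0, 0.0, 0.1, 0.1, 0.1])
groups = [np.arange(0, 3), np.arange(3, 6)]
s = group_soft_threshold(c, groups, 1.0)
# First block has norm 5, so it survives scaled by (1 - 1/5) = 0.8;
# the second block has norm ~0.17 < 1 and is zeroed out entirely.
```

Contrast with element-wise l1 shrinkage: the l1 prox would keep the large individual entries of both blocks independently, while the group prox discards the weak class as a whole.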
“…Jiang et al [17] proposed dictionary decomposition based on a sparse and dense hybrid representation method to address corrupted training data and insufficient representative samples for each class. Tan et al [18] proposed a group sparsity and kernelised locality-sensitive method, in which local similarity is measured in the kernel space rather than with the Euclidean distance. To reduce the feature vectors, the frequency components of the face are extracted by the discrete cosine transform (DCT) [19–23] and the Fourier transform [24].…”
Section: Introduction
confidence: 99%