2011
DOI: 10.1117/12.913468
Recognition of faces using texture-based principal component analysis and Grassmannian distances analysis

Abstract: This paper introduces a new face recognition method, texture-based Principal Component Analysis (PCA), which applies PCA to texture features. Initially, the eigenspace of texture images is created from the eigenvalues and eigenvectors. From this space, the eigentextures are constructed, and the most significant eigentextures are selected using PCA. With these eigentextures, Grassmannian distances are generalized to the texture feature space for recognition. We address the problem of face recognition in terms of the subject-specific sub…
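The abstract's recognition step compares spans of selected eigentextures using a distance on the Grassmann manifold. The paper's exact choice of Grassmannian distance is not given in the visible text, so the sketch below shows one common variant, the geodesic distance obtained from the principal angles between two subspaces, using random data purely for illustration:

```python
import numpy as np
from scipy.linalg import subspace_angles

def grassmann_distance(A, B):
    """Geodesic distance on the Grassmann manifold between span(A) and
    span(B): the 2-norm of the vector of principal angles (in radians)
    between the two column spaces."""
    theta = subspace_angles(A, B)
    return np.linalg.norm(theta)

rng = np.random.default_rng(0)

# Two 10-dimensional subspaces of R^100, standing in for the spans of
# leading eigentextures of two face images (illustrative data only).
A = rng.standard_normal((100, 10))
B = rng.standard_normal((100, 10))

d_ab = grassmann_distance(A, B)
d_aa = grassmann_distance(A, A)  # should be ~0 up to floating-point error

print(f"d(A, B) = {d_ab:.4f}")
print(f"d(A, A) = {d_aa:.2e}")
```

In a nearest-subspace classifier of the kind the abstract suggests, a probe image's eigentexture span would be assigned the identity of the gallery subspace minimizing this distance.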

Cited by 1 publication (1 citation statement). References 12 publications.
“…Initially explored in the setting of subspace packing problems [30,5,16], the application of Stiefel and Grassmann manifolds has become widespread in computer vision and pattern recognition. Examples include: video processing [12], classification [11,4,33,34], action recognition [2], expression analysis [31,32,17], domain adaptation [15,28], regression [29,13], pattern recognition [18], and computation of subspace means [3,22]. More recently, Grassmannians have also been explored in the deep neural network literature [14].…”
Section: Introduction
Confidence: 99%