2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/cvpr.2010.5539969
Bimodal gender recognition from face and fingerprint

Cited by 24 publications (18 citation statements)
References 21 publications
“…There are other attempts to integrate generative and discriminative models. For example, [8,17,10] use generative models as priors over discriminative models and [24,12] learn generative models with the help of discriminative constraints. These methods are theoretically distinct from our method (as well as FESS and FK/TK methods).…”
Section: Discussion
Confidence: 99%
“…There have been some previous attempts to fuse multiple biometrics such as combining the face, gait, fingerprint, palm print and geometry for gender recognition [29][30][31]. Table 2 shows more related works for the fusion of physiological biometrics and the gap of this study.…”
Section: Gender Recognition Using Physiological Signal
Confidence: 99%
“…Deceit detection, a less popular computational perception problem [2], also focuses on changes in cues in the face, where specialized sets of features and datasets are dedicated to solving this problem. Age [3], [4] and gender [5] estimation similarly apply a myriad of techniques, but on a set of specialized features learned and tested from a task-dedicated dataset.…”
Section: Introduction
Confidence: 99%