2004 2nd International IEEE Conference on 'Intelligent Systems'. Proceedings (IEEE Cat. No.04EX791)
DOI: 10.1109/is.2004.1344642
Dimensionality reduction of face images for gender classification

Cited by 22 publications (19 citation statements)
References 11 publications
“…In Villegas and Paredes (2011) the authors show a good comparison of different methods on a gender recognition problem, among others. These techniques have been widely used because of their simplicity and effectiveness (Buchala et al (2004); Graf and Wichmann (2002)). However, they might not capture relevant information to represent a face in the gender recognition problem.…”
Section: Related Work
confidence: 99%
“…Although there are several works on gender recognition of human face images [3] [8], there is no standard database or protocol for experimentation in this task. [6].…”
Section: Dataset
confidence: 99%
“…When class information was available, classical techniques focused on obtaining discriminative features (discriminant analysis) as well as a reduction of dimensionality. All these techniques have been widely used because of their simplicity and effectiveness [3].…”
Section: Introduction
confidence: 99%
“…In fact, an accuracy as good as 96% can be achieved with the hair concealed, facial hair removed and no makeup [6]. In recent years, a lot of effort has been spent on statistical-feature-based [4], [5], [16], [15], [10] approaches for gender classification. Most of them are based on 2D intensity information.…”
Section: Introduction
confidence: 99%