Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence 2018
DOI: 10.24963/ijcai.2018/99
Age Estimation Using Expectation of Label Distribution Learning

Abstract: Age estimation performance has been greatly improved by using convolutional neural networks. However, existing methods have an inconsistency between the training objectives and the evaluation metric, so they may be suboptimal. In addition, these methods always adopt image classification or face recognition models with a large number of parameters, which brings expensive computation cost and storage overhead. To alleviate these issues, we design a lightweight network architecture and propose a unified framework which…

Cited by 124 publications (95 citation statements)
References 18 publications
“…In both algorithms of T6, Bidirectional Long Short-Term Memory (BLSTM) architectures were explored with slightly varied parameters, loss functions, and summation methods. For example, Alg. 1 employed Label Distribution Age Encoding (LDAE) [67] as the age loss function, whereas Alg. 2 exploited Deep Label Distribution Learning (DLDL) [68,69]. They implemented two different accumulation methods: for Alg. 1, test predictions were accumulated with the arithmetic mean; for Alg. 2, with the geometric mean.…”
Section: Explored Classifiers
Citation type: mentioning (confidence: 99%)
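The excerpt above contrasts accumulating test-time predictions with the arithmetic versus the geometric mean. A minimal sketch of the two accumulation schemes over softmax outputs (the bin values and toy probabilities below are illustrative, not from the cited work):

```python
import numpy as np

def arithmetic_mean_accumulate(pred_list):
    """Average per-class probabilities across several predictions."""
    probs = np.mean(pred_list, axis=0)
    return probs / probs.sum()

def geometric_mean_accumulate(pred_list):
    """Geometric mean of per-class probabilities, renormalized to sum to 1."""
    log_probs = np.log(np.clip(pred_list, 1e-12, None))
    probs = np.exp(log_probs.mean(axis=0))
    return probs / probs.sum()

# Toy example: three softmax outputs over five age bins (hypothetical values)
preds = np.array([
    [0.10, 0.20, 0.40, 0.20, 0.10],
    [0.05, 0.25, 0.45, 0.15, 0.10],
    [0.10, 0.30, 0.35, 0.15, 0.10],
])
ages = np.arange(20, 25)  # assumed bin centers, ages 20..24
p_arith = arithmetic_mean_accumulate(preds)
p_geom = geometric_mean_accumulate(preds)
age_arith = float(np.dot(ages, p_arith))
age_geom = float(np.dot(ages, p_geom))
```

The geometric mean down-weights classes on which any single prediction assigns low probability, so it tends to produce sharper consensus distributions than the arithmetic mean.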
“…The apparent age model was addressed as a classification problem that treats each age value as an independent category. The authors in [33] designed a lightweight CNN architecture that jointly learned the age distribution and regressed the age. The CNN-based approach, ThinAgeNet, employed a compression rate of 0.5.…”
Section: B. Apparent Age
Citation type: mentioning (confidence: 99%)
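The excerpt above describes learning an age distribution and regressing the age from it. A minimal sketch of the label-distribution idea used in DLDL-style methods: encode a scalar age as a discrete Gaussian over age bins, and predict the age as the expectation of the (here, target) distribution. The bin range and sigma are assumptions for illustration:

```python
import numpy as np

def gaussian_label_distribution(true_age, ages, sigma=2.0):
    """Encode a scalar age as a discrete Gaussian distribution over age bins
    (the label-distribution encoding used in DLDL-style methods)."""
    d = np.exp(-0.5 * ((ages - true_age) / sigma) ** 2)
    return d / d.sum()

def expected_age(prob, ages):
    """Predict age as the expectation of a distribution over age bins."""
    return float(np.dot(prob, ages))

ages = np.arange(0, 101)  # assumed age bins 0..100
target = gaussian_label_distribution(31.0, ages)
# The expectation of the target distribution recovers the encoded age
pred = expected_age(target, ages)
```

Taking the expectation at test time is what aligns the training objective with the MAE evaluation metric, which is the inconsistency the paper's abstract points to.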
“…In designing our CNN, we consider the model size and efficiency. The network is a modification of the CNN architectures in [36], [33].…”
Section: B. CNN Architecture
Citation type: mentioning (confidence: 99%)
“…Further, the authors use DLDL with a KL-divergence loss, evaluating the work on the ChaLearn and MORPH datasets; the MAE obtained is 5.75 [35]. The backward face aging model digitally takes an adult face and estimates the corresponding childhood face by considering the face contour, its different components, and the face texture with the help of a geometrical model. Considering the fact that face aging is a non-linear process,…”
Section: Related Work
Citation type: mentioning (confidence: 99%)
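The excerpt above mentions training DLDL with a KL-divergence loss. A minimal sketch of KL divergence between a target label distribution and a predicted one (the toy distributions are illustrative):

```python
import numpy as np

def kl_divergence(target, pred, eps=1e-12):
    """KL(target || pred) between two discrete label distributions."""
    target = np.clip(target, eps, None)
    pred = np.clip(pred, eps, None)
    return float(np.sum(target * (np.log(target) - np.log(pred))))

# Hypothetical target and predicted distributions over five age bins
p = np.array([0.10, 0.20, 0.40, 0.20, 0.10])
q = np.array([0.25, 0.25, 0.20, 0.15, 0.15])
loss_same = kl_divergence(p, p)  # zero when prediction matches target
loss_diff = kl_divergence(p, q)  # positive when distributions differ
```

Minimizing this loss pushes the predicted distribution toward the Gaussian target distribution, rather than toward a one-hot age label.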