2022
DOI: 10.1109/access.2022.3168310
Angular Margin-Mining Softmax Loss for Face Recognition

Abstract: Face recognition methods have been significantly improved in recent years owing to the advances made in loss functions. Typically, loss functions are designed to enhance the separability power by concentrating on hard samples in mining-based approaches or by increasing the feature margin between different classes in margin-based approaches. However, margin-based methods lack the utilization of informative hard samples, and mining-based methods also fail to learn the latent correlations between classes. Moreover…
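The margin-based idea the abstract contrasts with mining can be illustrated with a minimal NumPy sketch: add an angular margin to the target-class angle before scaling, in the style of A-Softmax/ArcFace. The function name, margin value, and scale are illustrative assumptions, not the paper's exact AMM-Softmax formulation.

```python
import numpy as np

def angular_margin_logits(features, weights, labels, m=0.5, s=30.0):
    """Margin-based softmax sketch: add an angular margin m to the
    target-class angle, then scale all logits by s. Hypothetical
    illustration, not the paper's exact AMM-Softmax loss."""
    # L2-normalize features and class weights so logits are cosines.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                  # cos(theta) per class
    theta = np.arccos(np.clip(cos, -1.0, 1.0))   # recover the angles
    rows = np.arange(len(labels))
    theta[rows, labels] += m                     # margin on the true class only
    return s * np.cos(theta)                     # rescaled margin logits

# toy usage with random features, weights, and labels
rng = np.random.default_rng(0)
logits = angular_margin_logits(rng.normal(size=(4, 8)),
                               rng.normal(size=(8, 3)),
                               np.array([0, 1, 2, 0]))
```

Only the true-class logit is penalized by the margin; the remaining columns stay as plain scaled cosines, which is what forces a larger angular gap between classes during training.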

Cited by 3 publications (3 citation statements)
References 22 publications
“…The center loss [7] simultaneously learns a center for deep features of each class and penalizes the distances between the deep features and their corresponding class centers. However, the center loss faces the drawbacks of high computational cost and sensitivity to the sample selecting strategy, which significantly affects the performance [18]. A-Softmax loss [19] tries to incorporate the angular margin into the softmax loss function, which has achieved promising performance.…”
Section: Related Work
confidence: 99%
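The center loss described in the quoted passage can be sketched in a few lines: penalize the squared distance between each deep feature and its class center, and move the centers toward their class means each step. Function names, shapes, and the update rate are illustrative assumptions, not the cited paper's code.

```python
import numpy as np

def center_loss(features, labels, centers):
    """Center-loss sketch: mean squared distance between each deep
    feature and the learned center of its class."""
    diffs = features - centers[labels]        # (N, D) feature-to-center gaps
    return 0.5 * np.mean(np.sum(diffs ** 2, axis=1))

def update_centers(features, labels, centers, alpha=0.5):
    """Move each class center toward the mean of that class's features
    by step alpha (a simple stand-in for the paper's center update)."""
    new_centers = centers.copy()
    for c in np.unique(labels):
        mask = labels == c
        new_centers[c] += alpha * (features[mask].mean(axis=0) - centers[c])
    return new_centers
```

The extra center-update pass over every class each iteration is one source of the computational cost the quoted passage mentions, and the loss value depends directly on which samples feed the center estimates, which is its sensitivity to the sample-selection strategy.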
“…Due to the values of weights and features being unbounded, samples in different classes still can overlap. AMM-Softmax [18] designs a linear angular margin for hard samples, which allows optimization of the geodesic distance margin and maximization of class separability. According to the article, the consistency of the margin also provides stability and acceleration during training [18].…”
Section: Related Work
confidence: 99%
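The "linear angular margin for hard samples" this statement attributes to AMM-Softmax can be sketched as a margin that grows linearly with sample hardness. The hardness measure (how far the best negative cosine exceeds the target cosine), the base margin, and the slope are assumptions for illustration, not the paper's actual values.

```python
import numpy as np

def mine_hard_margin(cos_target, cos_neg_max, base_m=0.35, k=0.25):
    """Mining-plus-margin sketch: a sample counts as 'hard' when some
    negative-class cosine exceeds the target cosine; hard samples get
    an extra margin growing linearly with that gap. base_m, k, and the
    hardness measure are hypothetical, not the paper's formulation."""
    hardness = np.maximum(cos_neg_max - cos_target, 0.0)  # 0 for easy samples
    return base_m + k * hardness                          # linear in hardness
```

Because the margin is a fixed affine function of hardness rather than a per-sample heuristic, the penalty changes smoothly between easy and hard samples, which is the consistency property the quoted statement credits with stabilizing training.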