2019 IEEE/CVF International Conference on Computer Vision (ICCV) 2019
DOI: 10.1109/iccv.2019.00850
AM-LFS: AutoML for Loss Function Search

Abstract: Designing an effective loss function plays an important role in visual analysis. Most existing loss function designs rely on hand-crafted heuristics that require domain experts to explore the large design space, which is usually suboptimal and time-consuming. In this paper, we propose AutoML for Loss Function Search (AM-LFS), which leverages REINFORCE to search loss functions during the training process. The key contribution of this work is the design of the search space, which can guarantee the generalization and …

Cited by 63 publications (41 citation statements)
References 42 publications (78 reference statements)
“…It is almost impossible to explore specific Gaussian distributions for various landmarks and tasks with hand-crafted heuristics, especially when the number of landmarks increases. Inspired by the work [30], we can consider the variances or standard deviations of the Gaussian distributions as the network hyperparameters to determine the distance metric between the predicted values and the ground truth. We propose a learning-to-learn framework equipping U-Net with an RL-based approach, which can steadily optimize the network parameters and the Gaussian standard deviations of the heatmap ground truth simultaneously.…”
Section: A Landmark-aware Objective Metric Learningmentioning
confidence: 99%
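The statement above treats the standard deviation of the heatmap ground truth as a tunable hyperparameter rather than a fixed constant. A minimal sketch of what that hyperparameter controls — generating a Gaussian ground-truth heatmap for one landmark, with `sigma` as the quantity the cited work would tune via RL (the function name and shapes are illustrative, not from the paper):

```python
import math

def gaussian_heatmap(h, w, cx, cy, sigma):
    # Ground-truth heatmap for a landmark at (cx, cy); larger sigma
    # spreads the supervision signal over more pixels.
    return [[math.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))
             for x in range(w)]
            for y in range(h)]

# Peak value is 1.0 at the landmark and decays with distance.
hm = gaussian_heatmap(5, 5, 2, 2, 1.0)
```

Hand-picking `sigma` per landmark and per task is exactly the manual exploration the citing work replaces with a learning-to-learn loop.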
“…Moreover, some auxiliary losses [25,26] are employed to train models together with classification loss functions. Recently, some approaches focus on hard examples [27] or AutoML search [28,29] to obtain better loss functions. Although these loss functions achieve increasingly strong performance on FR, they all depend on fully labeled identity datasets, such as MS-Celeb-1M and Casia, and cannot treat face sequences as training data to learn discriminative face features.…”
Section: Deep Face Recognitionmentioning
confidence: 99%
“…Previous methods rely on hand-crafted heuristics that require much effort from experts to explore the large design space. To address this issue, Li et al (Li et al, 2019) propose a new AutoML for Loss Function Search (AM-LFS) to automatically determine the search space. Specifically, the formulation of AM-LFS is written as follows:…”
Section: Preliminary Knowledgementioning
confidence: 99%
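The equation is cut off in this excerpt. A reconstruction of the usual bilevel hyperparameter-search form it refers to (notation approximate, not verbatim from the paper) is:

```latex
\max_{\theta}\;
\mathbb{E}_{\sigma \sim p(\sigma;\theta)}
\left[\, R\!\left(\omega^{*}(\sigma)\right) \right]
\quad \text{s.t.} \quad
\omega^{*}(\sigma) = \arg\min_{\omega}\,
\mathcal{L}\!\left(\omega;\, \sigma\right),
```

where $\theta$ parameterizes the sampling distribution over loss hyperparameters $\sigma$, $\omega$ denotes the model weights trained under the sampled loss, and $R$ is the validation reward whose expectation is maximized with REINFORCE.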
“…Recently, Li et al (Li et al, 2019) propose an AutoML-based loss function search method (AM-LFS) from a hyperparameter optimization perspective. Specifically, they formulate the hyper-parameters of loss functions as samples from a parameterized probability distribution and achieve promising results on several different vision tasks.…”
Section: Introductionmentioning
confidence: 99%
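The distribution-sampling view described above can be illustrated with a minimal REINFORCE sketch: sample a loss hyperparameter from a parameterized Gaussian, observe a reward, and nudge the distribution's mean toward higher reward. Everything here is a toy stand-in — the reward function, learning rate, and variable names are illustrative assumptions, not the paper's actual setup:

```python
import math
import random

random.seed(0)

def reward(sigma):
    # Toy proxy for validation accuracy: best when sigma is near 2.0.
    return -(sigma - 2.0) ** 2

mu = 0.0        # mean of the sampling distribution p(sigma; mu)
std = 1.0       # kept fixed for simplicity
lr = 0.02
baseline = 0.0  # moving-average baseline reduces gradient variance

for step in range(3000):
    sigma = random.gauss(mu, std)       # sample a loss hyperparameter
    r = reward(sigma)
    baseline = 0.9 * baseline + 0.1 * r
    # REINFORCE score function: d/d(mu) log N(sigma; mu, std) = (sigma - mu) / std^2
    grad_mu = (r - baseline) * (sigma - mu) / std ** 2
    mu += lr * grad_mu                  # ascend the expected reward

# mu drifts toward the reward-maximizing value (~2.0 for this toy reward)
```

In AM-LFS proper, the inner step is a full model-training episode under the sampled loss, and the reward is validation performance; the outer loop above is the part handled by REINFORCE.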