2021
DOI: 10.48550/arxiv.2103.14026
Preprint

AutoLoss-Zero: Searching Loss Functions from Scratch for Generic Tasks

Cited by 6 publications (11 citation statements)
References 41 publications

“…Compared with these handcrafted losses, Parameterized AP Loss can yield 1.5 ∼ 3.0 AP score gain. We also compared with AutoLoss-Zero [19] and CSE AutoLoss [28], which are two AutoML-based losses. Our approach can obtain over 1.5 AP score improvement over them.…”
Section: Results (mentioning, confidence: 99%)
“…Searching Loss Functions for Object Detection. Recent works [28,19] have also tried to search suitable loss functions for object detection. In these methods, loss functions are formulated as computational graphs composed of basic mathematical operators.…”
Section: Related Work (mentioning, confidence: 99%)
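The "computational graph of basic mathematical operators" formulation mentioned in this citation can be made concrete with a short sketch. The snippet below is an illustrative assumption only: the primitive set, the guarded log/exp, and the scalar per-sample evaluation are simplifications, not the actual search space or encoding of AutoLoss-Zero [19] or CSE AutoLoss [28].

```python
import math
from dataclasses import dataclass
from typing import List

# Primitive operators: name -> (arity, function). A small illustrative set;
# the real search spaces use different (larger) primitive sets.
PRIMITIVES = {
    "add": (2, lambda a, b: a + b),
    "sub": (2, lambda a, b: a - b),
    "mul": (2, lambda a, b: a * b),
    "neg": (1, lambda a: -a),
    "log": (1, lambda a: math.log(max(a, 1e-12))),  # guarded to avoid log(0)
    "exp": (1, lambda a: math.exp(min(a, 20.0))),   # guarded to avoid overflow
}

@dataclass
class Node:
    """One node of the loss graph: an operator name or a leaf ('pred'/'target')."""
    op: str
    children: List["Node"]

def evaluate(node: Node, pred: float, target: float) -> float:
    """Recursively evaluate the loss graph for a single (prediction, target) pair."""
    if node.op == "pred":
        return pred
    if node.op == "target":
        return target
    _, fn = PRIMITIVES[node.op]
    return fn(*(evaluate(c, pred, target) for c in node.children))

# Example tree that happens to encode -target * log(pred), a cross-entropy-like loss.
ce_like = Node("neg", [Node("mul", [Node("target", []),
                                    Node("log", [Node("pred", [])])])])

print(evaluate(ce_like, pred=0.8, target=1.0))  # ≈ 0.223
```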
“…Subsequently, Li et al [42] and Liu et al [43] introduced this idea into semantic segmentation and object detection, respectively. More recently, Li et al [44] proposed AutoLoss-Zero, a general framework for learning tasks. They verified the effectiveness of AutoLoss-Zero on four tasks, i.e., semantic segmentation, object detection, instance segmentation, and pose estimation.…”
Section: Auto Loss Function Search (mentioning, confidence: 99%)
“…To solve the above problem, each loss function is expressed as a tree, and the genetic programming (GP) [14] is adopted for optimizing a suitable solution among the search space. Our method is similar to CSE-AutoLoss [43] and AutoLoss-Zero [44], and is dubbed as AutoLoss-AR, which is the abbreviation of auto loss function search for adversarial risk. The major difference is that AutoLoss-AR focuses on searching for a suitable loss for tightening the approximation error of adversarial risk, rather than training a model for a specific task with a better performance.…”
Section: A. Overview (mentioning, confidence: 99%)
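The tree representation plus genetic programming described in this citation can be sketched as a small evolutionary loop over the expression trees from the previous snippet (it reuses Node and PRIMITIVES defined there). The population size, mutation rate, and the user-supplied proxy fitness callback are hypothetical choices for illustration; they do not reproduce the search procedures of AutoLoss-AR, CSE-AutoLoss [43], or AutoLoss-Zero [44].

```python
import copy
import random
from typing import Callable

# Reuses Node and PRIMITIVES from the previous sketch.
LEAVES = ["pred", "target"]

def random_tree(depth: int = 3) -> Node:
    """Grow a random loss tree of at most the given depth."""
    if depth == 0 or random.random() < 0.3:
        return Node(random.choice(LEAVES), [])
    op = random.choice(list(PRIMITIVES))
    arity, _ = PRIMITIVES[op]
    return Node(op, [random_tree(depth - 1) for _ in range(arity)])

def mutate(tree: Node, p: float = 0.2) -> Node:
    """Return a copy of the tree with random subtrees replaced by fresh ones."""
    tree = copy.deepcopy(tree)
    def walk(node: Node) -> Node:
        if random.random() < p:
            return random_tree(depth=2)
        node.children = [walk(c) for c in node.children]
        return node
    return walk(tree)

def gp_search(fitness: Callable[[Node], float],
              pop_size: int = 20, generations: int = 10) -> Node:
    """Keep the fitter half of the population, refill it with mutated survivors."""
    population = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop_size // 2]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return max(population, key=fitness)

# Usage (hypothetical): `fitness` would train/evaluate a proxy model with the
# candidate loss and return a validation score; keeping that evaluation cheap
# is the main practical concern these loss-search methods share.
```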