2018
DOI: 10.1007/978-3-030-01246-5_5

Learning with Biased Complementary Labels

Abstract: In this paper, we study the classification problem in which we have access to an easily obtainable surrogate for true labels, namely complementary labels, which specify classes that observations do not belong to. Let Y and Ȳ be the true and complementary labels, respectively. We first model the annotation of complementary labels via transition probabilities P(Ȳ = i | Y = j), i ≠ j ∈ {1, · · · , c}, where c is the number of classes. Previous methods implicitly assume that P(Ȳ = i | Y = j), ∀i ≠ j, are identical, which i…
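Concretely, once the transition probabilities are collected into a matrix Q with Q[i, j] = P(Ȳ = i | Y = j), the model's predicted class probabilities can be mapped into complementary-label space before a cross-entropy loss is applied to the observed complementary labels. The NumPy sketch below is only an illustration of that transition-corrected loss under the assumption that Q is known or already estimated; the function name and array layout are not taken from the paper.

```python
import numpy as np

def complementary_cross_entropy(probs, comp_labels, Q):
    """Cross-entropy on transition-adjusted predictions (illustrative sketch).

    probs       : (n, c) array, probs[k, j] = estimated P(Y = j | x_k)
    comp_labels : (n,) array of complementary label indices
    Q           : (c, c) transition matrix, Q[i, j] = P(Ybar = i | Y = j)
    """
    # Map predictions into complementary-label space:
    # P(Ybar = i | x) = sum_j Q[i, j] * P(Y = j | x)
    comp_probs = probs @ Q.T
    # Negative log-likelihood of the observed complementary labels
    eps = 1e-12
    nll = -np.log(comp_probs[np.arange(len(comp_labels)), comp_labels] + eps)
    return nll.mean()
```

Under the uniform assumption made by earlier work, Q has identical off-diagonal entries 1/(c-1) and a zero diagonal; the biased setting studied here simply allows those off-diagonal entries to differ.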

Citations: cited by 147 publications (174 citation statements)
References: 24 publications
“…Existing Discriminative Complementary Learning (DCL) methods modify the ordinary classification loss ℓ into the complementary classification loss ℓ̄ to provide a consistent estimation of f. Various loss functions have been considered in the literature, such as one-vs-all ramp/sigmoid loss (Ishida et al. 2017), pairwise-comparison ramp/sigmoid loss (Ishida et al. 2017), and cross-entropy loss (Yu et al. 2018).…”
Section: Discriminative Complementary Learning (mentioning, confidence: 99%)
“…In this paper, we consider a recently proposed weakly supervised classification scenario, i.e., learning from complementary labels (Ishida et al. 2017; Yu et al. 2018). Unlike an ordinary label, a complementary label specifies a class that an input instance does not belong to.…”
Section: Introduction (mentioning, confidence: 99%)
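To make the setting in this statement concrete, a complementary label can be produced by drawing any class other than the true one: drawing uniformly corresponds to the unbiased assumption of Ishida et al. (2017), while the biased case of Yu et al. (2018) would instead weight the candidate classes by P(Ȳ = i | Y = j). The small helper below is only an illustrative sketch of the uniform case; the function name is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def uniform_complementary_label(y, num_classes):
    """Return a class index the sample does NOT belong to, drawn uniformly."""
    candidates = [k for k in range(num_classes) if k != y]
    return int(rng.choice(candidates))

# With 4 classes and true label 2, the complementary label lies in {0, 1, 3}.
print(uniform_complementary_label(2, num_classes=4))
```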
“…Liu et al [39] use importance reweighting to modify any loss function for classification with noisy labels, and extend the random classification noise to a bounded case [40]. Moreover, Yu et al [41] prove that biased complementary labels can enhance multi-class classification. These methods have been successfully applied in binary classification and multi-class classification, which may lead to improvements when extended to our large-scale image annotation where each image is annotated with one or more tags and labels (but out of the scope of this paper).…”
Section: Image Annotation Using Side Information (mentioning, confidence: 99%)