2021
DOI: 10.48550/arxiv.2106.08864
Preprint

Multi-Class Classification from Single-Class Data with Confidences

Yuzhou Cao,
Lei Feng,
Senlin Shu
et al.

Abstract: Can we learn a multi-class classifier from only data of a single class? We show that without any assumptions on the loss functions, models, and optimizers, we can successfully learn a multi-class classifier from only data of a single class with a rigorous consistency guarantee when confidences (i.e., the class-posterior probabilities for all the classes) are available. Specifically, we propose an empirical risk minimization framework that is loss-/model-/optimizer-independent. Instead of constructing a boundar…
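The abstract's ERM framework can be illustrated with a minimal sketch. Assuming the standard importance-weighted risk rewrite used in confidence-based weakly supervised learning (the multi-class risk over p(x) is re-expressed, up to the constant class prior, as an expectation over single-class data weighted by 1/η₁(x), where η(x) is the given confidence vector), the empirical risk becomes (1/n) Σᵢ (1/η₁(xᵢ)) Σₖ ηₖ(xᵢ) ℓ(f(xᵢ), k). The synthetic data, Gaussian mixture, clipping threshold, and linear softmax model below are all illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data from a 3-class Gaussian mixture (equal priors,
# identity covariance). The learner only ever sees samples drawn from
# class 0, plus the confidence vector eta(x) = (p(y=k|x))_k for each.
means = np.array([[2.0, 0.0], [-1.0, 2.0], [-1.0, -2.0]])
K = len(means)

def true_confidences(X):
    # Class-posteriors of the generating mixture; these play the role
    # of the given confidences.
    d2 = ((X[:, None, :] - means[None, :, :]) ** 2).sum(-1)
    logits = -0.5 * d2
    e = np.exp(logits - logits.max(1, keepdims=True))
    return e / e.sum(1, keepdims=True)

# Training data: class 0 only.
n = 2000
X = rng.normal(size=(n, 2)) + means[0]
eta = true_confidences(X)              # confidences for all K classes

def softmax(Z):
    e = np.exp(Z - Z.max(1, keepdims=True))
    return e / e.sum(1, keepdims=True)

# Linear softmax model trained on the importance-weighted rewrite:
#   (1/n) sum_i (1/eta_1(x_i)) sum_k eta_k(x_i) * CE(f(x_i), k)
# with cross-entropy as the (freely chosen) loss. Clipping the weights
# is a stabilizing assumption for this sketch.
W = np.zeros((2, K)); b = np.zeros(K)
w_imp = 1.0 / np.clip(eta[:, 0], 1e-3, None)   # importance weights 1/eta_1
lr = 0.1
for _ in range(500):
    P = softmax(X @ W + b)
    # Gradient of the weighted cross-entropy w.r.t. the logits:
    # w_i * (P_i - eta_i), averaged over the sample.
    G = (w_imp[:, None] * (P - eta)) / n
    W -= lr * (X.T @ G)
    b -= lr * G.sum(0)

# Evaluate on a balanced test set drawn from all three classes.
Xt = np.concatenate([rng.normal(size=(500, 2)) + m for m in means])
yt = np.repeat(np.arange(K), 500)
acc = ((Xt @ W + b).argmax(1) == yt).mean()
print(f"test accuracy: {acc:.2f}")
```

Although trained only on class-0 inputs, the confidence vectors carry gradient signal for all K classes at every point, which is what lets the weighted objective recover a multi-class decision rule.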

Cited by 1 publication (1 citation statement) | References 23 publications
“…Candidate Label Confidence. Various weakly supervised learning settings based on confidence data have been studied recently, including positive-confidence data [Ishida et al., 2018], similarity-confidence data [Cao et al., 2021b], confidence data for instance-dependent label noise [Berthon et al., 2021], and single-class confidence [Cao et al., 2021a]. Here, we propose the following candidate label confidence:…”
Section: Risk-Consistent Algorithm
confidence: 99%