2021
DOI: 10.1609/aaai.v35i9.16953

Predictive Adversarial Learning from Positive and Unlabeled Data

Abstract: This paper studies learning from positive and unlabeled examples, known as PU learning. It proposes a novel PU learning method called Predictive Adversarial Networks (PAN) based on GAN (Generative Adversarial Networks). GAN learns a generator to generate data (e.g., images) to fool a discriminator which tries to determine whether the generated data belong to a (positive) training class. PU learning can be cast as trying to identify (not generate) likely positive instances from the unlabeled set to fool a dis…
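To make the adversarial casting concrete, here is a minimal sketch of the idea the abstract describes: a "selector" network weights unlabeled instances by how likely they are positive, and a discriminator is trained to tell labeled positives from the selector's picks. This is an illustrative reading, not the authors' implementation; all names and hyperparameters are placeholders, and PAN's actual objective (KL-based, see the excerpts below) differs from the standard log-loss used here.

```python
import torch
import torch.nn as nn

class MLP(nn.Module):
    """Tiny scoring network used for both selector and discriminator."""
    def __init__(self, d_in):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(),
                                 nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x).squeeze(-1)  # one raw logit per instance

def pan_step(selector, disc, opt_s, opt_d, x_pos, x_unl):
    """One adversarial update: disc separates positives from the selector's
    picks; the selector re-weights unlabeled instances to fool disc."""
    bce = nn.BCEWithLogitsLoss(reduction="none")
    eps = 1e-8

    # Discriminator step: labeled positives -> 1; unlabeled instances -> 0,
    # weighted by how strongly the (frozen) selector currently picks them.
    w = torch.sigmoid(selector(x_unl)).detach()
    d_loss = bce(disc(x_pos), torch.ones(len(x_pos))).mean() \
           + (w * bce(disc(x_unl), torch.zeros(len(x_unl)))).sum() / (w.sum() + eps)
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Selector step: shift weight onto the unlabeled instances the (frozen)
    # discriminator already scores as positive-looking, i.e. try to fool it.
    fool = bce(disc(x_unl).detach(), torch.ones(len(x_unl)))
    w = torch.sigmoid(selector(x_unl))
    s_loss = (w * fool).sum() / (w.sum() + eps)
    opt_s.zero_grad(); s_loss.backward(); opt_s.step()
    return d_loss.item(), s_loss.item()
```

In a real run one would alternate `pan_step` over minibatches and read a positive-class score off either network; how PAN derives its final classifier is specified in the paper itself.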

Cited by 22 publications (16 citation statements)
References 29 publications
“…An intrinsic challenge for this class of problems is that the number of true positives, i.e., the prior class distribution [86–91], is unknown, and most classifiers require labels for training. Motivated by the robustness and the performance of ensemble approaches such as bagging in PU learning [39,86,87], we develop a statistical approach to separate candidate genes from non-candidate genes using an ensemble approach [87,88,92], which eliminates the need to predefine [39] or estimate [88] a prior class distribution or to choose an arbitrary cut-off [40,42] on predicted rank distributions.…”
Section: The Ensemble Approach
confidence: 99%
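The excerpt above leans on bagging-style PU learning. A minimal sketch of that idea, assuming a scikit-learn base classifier and out-of-bag score averaging (function names, the sample size, and the choice of base learner are illustrative assumptions, not the cited pipeline):

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def pu_bagging(X_pos, X_unl, n_rounds=100, sample_size=None, seed=0):
    """Repeatedly treat a random subsample of the unlabeled pool as
    negatives, train positives-vs-subsample, and average each unlabeled
    point's out-of-bag positive-class scores."""
    rng = np.random.default_rng(seed)
    n_u = len(X_unl)
    k = sample_size or len(X_pos)        # negatives drawn per round
    y = np.r_[np.ones(len(X_pos)), np.zeros(k)]
    score_sum = np.zeros(n_u)
    oob_count = np.zeros(n_u)
    for _ in range(n_rounds):
        idx = rng.choice(n_u, size=k, replace=False)
        clf = DecisionTreeClassifier().fit(np.vstack([X_pos, X_unl[idx]]), y)
        oob = np.setdiff1d(np.arange(n_u), idx)   # unlabeled held out this round
        score_sum[oob] += clf.predict_proba(X_unl[oob])[:, 1]
        oob_count[oob] += 1
    return score_sum / np.maximum(oob_count, 1)   # mean out-of-bag P(positive)
```

Averaging only out-of-bag scores avoids scoring a point with a model that saw it labeled as negative, which is what makes the ensemble robust to the unknown class prior the excerpt describes.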
“…For example, GenPU (Hou et al. 2018) proposed to train an array of generators and discriminators to distinguish between the positive and negative instances in the unlabeled dataset. PAN (Hu et al. 2021) proposed to train the PU-learning classifier under the GAN framework by viewing instances selected by the classifier as the generated instances. However, these works generally focus on how to obtain a better classifier/discriminator rather than how to generate high-quality desired instances.…”
Section: Related Work
confidence: 99%
“…Predictive adversarial learning was introduced by [20,42]. They used a generator to produce data to fool a discriminator that determines whether the generated data are positive or not.…”
Section: Related Work, 2.1 Positive-Unlabeled (PU) Learning
confidence: 99%
“…The generated data are used to train a binary classifier. PAN [20] proposes a new objective based on KL-divergence and optimizes the GAN architecture. In this work, we propose a PUtree algorithm which splits PU instances into hierarchical communities; a PU path fusion network is then deployed to aggregate community information at different levels, which can be vital for various PU tasks, especially chronic disease prediction.…”
Section: Related Work, 2.1 Positive-Unlabeled (PU) Learning
confidence: 99%
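The excerpt notes that PAN's objective is based on KL-divergence. As a toy illustration only, one way to compare the discriminator's behavior on positive versus selected-unlabeled batches is the KL divergence between two Bernoulli distributions fitted to its mean scores; PAN's actual objective is defined in the paper and differs from this sketch.

```python
import torch

def bernoulli_kl(d_pos_logits, d_sel_logits, eps=1e-6):
    # Mean sigmoid score of each batch, treated as a Bernoulli parameter.
    p = torch.sigmoid(d_pos_logits).mean().clamp(eps, 1 - eps)
    q = torch.sigmoid(d_sel_logits).mean().clamp(eps, 1 - eps)
    # KL(Bern(p) || Bern(q)): in an adversarial setup of this kind, the
    # discriminator would push this apart and the selector would shrink it.
    return p * torch.log(p / q) + (1 - p) * torch.log((1 - p) / (1 - q))
```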