2019 57th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
DOI: 10.1109/allerton.2019.8919758

Privacy-Preserving Adversarial Networks

Abstract: We propose a data-driven framework for optimizing privacy-preserving data release mechanisms to attain the information-theoretically optimal tradeoff between minimizing distortion of useful data and concealing specific sensitive information. Our approach employs adversarially-trained neural networks to implement randomized mechanisms and to perform a variational approximation of mutual information privacy. We validate our Privacy-Preserving Adversarial Networks (PPAN) framework via proof-of-concept experiments…
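The abstract describes learning randomized release mechanisms that trade distortion of useful data against mutual-information leakage about a sensitive attribute. As a minimal, hand-computable sketch of that tradeoff (a fixed randomized-response mechanism, not the paper's adversarially-trained networks; all function names here are illustrative):

```python
import math

def binary_entropy(p):
    """Entropy in bits of a Bernoulli(p) variable."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def randomized_response_tradeoff(flip_prob):
    """For a uniform sensitive bit S released through a binary symmetric
    channel with crossover probability flip_prob, return (distortion, leakage):
      distortion = P(Y != S) = flip_prob
      leakage    = I(S; Y) = 1 - H(flip_prob) bits
    """
    distortion = flip_prob
    leakage = 1.0 - binary_entropy(flip_prob)
    return distortion, leakage

# More randomization -> more distortion, less mutual-information leakage.
for p in (0.0, 0.1, 0.3, 0.5):
    d, i = randomized_response_tradeoff(p)
    print(f"flip={p:.1f}  distortion={d:.2f}  I(S;Y)={i:.3f} bits")
```

At flip probability 0 the release is perfectly useful but leaks the full bit; at 0.5 it leaks nothing but is pure noise. PPAN's contribution is optimizing this tradeoff with learned, neural mechanisms rather than a fixed channel.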

Cited by 59 publications (50 citation statements)
References 23 publications
“…The discriminator aims to maximize an objective function in a minimax game that the generator aims to minimize. GANs have also been applied to enhance privacy [9,14]. For example, to protect health records, synthetic medical datasets can be published instead of the real ones using generative models trained on sensitive real-world medical datasets [3,6].…”
Section: Related Work and Discussion
confidence: 99%
“…For example, Li and Oechtering [40] proposed a new privacy metric based on distributed Bayesian detection, which can inform privacy-aware system design. Recently, Tripathy et al. [41] and Huang et al. [42] used adversarial networks for designing privacy-assuring mappings that navigate the privacy-utility tradeoff (PUT). Takbiri et al. [43] considered obfuscation and anonymization techniques and characterized the conditions required to obtain perfect privacy.…”
Section: Related Work
confidence: 99%
“…where P indicates privacy, D and D′ represent any two neighbouring datasets that differ in only a single element, T denotes a set of tuples, and ε represents the privacy budget. The privacy budget ε is an important factor in differential privacy, which ranges from 0 (minimum ε) to 1 (maximum ε) [14].…”
Section: Differential Privacy
confidence: 99%
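The quoted ε-DP guarantee can be checked concretely on a toy mechanism. A minimal sketch, assuming a one-bit randomized-response mechanism (not taken from the quoted paper; function names are illustrative): it verifies that the output distributions on two neighbouring one-record datasets satisfy the ratio bound P[M(D)=t] ≤ e^ε · P[M(D′)=t] for every output t.

```python
import math

def rr_probs(true_bit, epsilon):
    """Randomized response over {0, 1}: report the true bit with
    probability e^eps / (1 + e^eps), otherwise flip it.
    Returns the output distribution as {0: p0, 1: p1}."""
    keep = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return {true_bit: keep, 1 - true_bit: 1.0 - keep}

def dp_ratio_holds(epsilon):
    """Check the epsilon-DP inequality P[M(D)=t] <= e^eps * P[M(D')=t]
    for neighbouring one-record datasets D = {0}, D' = {1} and every
    output t (with a tiny slack for floating-point rounding)."""
    p_d = rr_probs(0, epsilon)
    p_dp = rr_probs(1, epsilon)
    bound = math.exp(epsilon)
    return all(p_d[t] <= bound * p_dp[t] + 1e-12
               and p_dp[t] <= bound * p_d[t] + 1e-12
               for t in (0, 1))

print(dp_ratio_holds(0.5))
```

For this mechanism the likelihood ratio between neighbouring datasets is exactly e^ε, so the bound is met with equality; any smaller ε would be violated, which is why ε is called the privacy budget.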