2019
DOI: 10.2478/popets-2019-0045
Investigating Statistical Privacy Frameworks from the Perspective of Hypothesis Testing

Abstract: Over the last decade, differential privacy (DP) has emerged as the gold standard for a rigorous and provable privacy framework. However, there are few practical guidelines on how to apply differential privacy in practice, and a key challenge is how to set an appropriate value for the privacy parameter ɛ. In this work, we employ the statistical tool of hypothesis testing to discover useful and interpretable guidelines for state-of-the-art privacy-preserving frameworks. We formalize and implement h…

Cited by 26 publications (25 citation statements)
References 52 publications
“…Privacy leakage attacks: an adversary against differential privacy is understood to seek to determine whether the observed outcome arose from one of two neighboring datasets differing in a single data point. While this hypothesis-testing interpretation of differential privacy was elucidated in [29,19,21,2,3], it is limited by the neighboring-datasets assumption and does not apply to the aggressive gradient-based attacks recently proposed by [12,34,35,31] or the model inversion attacks in [11,18]. These attacks follow the Bayesian restoration framework introduced in the present paper.…”
Section: Related Work
Mentioning confidence: 93%
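The neighboring-datasets hypothesis test described above can be simulated directly: an adversary observes one Laplace-noised release of a counting query and must decide which of two datasets, differing in a single record, produced it. The helper names below (`sample_laplace`, `attack_accuracy`) are illustrative assumptions, not taken from the cited works; for Laplace noise the optimal likelihood-ratio test reduces to thresholding at the midpoint of the two candidate counts.

```python
import math
import random

def sample_laplace(scale):
    # Inverse-CDF sampling of a centered Laplace(scale) variable.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def attack_accuracy(epsilon, trials=2000, sensitivity=1.0):
    """Fraction of trials in which a likelihood-ratio adversary correctly
    tells apart two neighboring counts (0 vs 1) from a single
    Laplace-noised release of the query answer."""
    correct = 0
    for _ in range(trials):
        secret = random.randint(0, 1)      # which neighboring dataset is real
        release = secret + sample_laplace(sensitivity / epsilon)
        guess = 1 if release > 0.5 else 0  # optimal test: midpoint threshold
        correct += (guess == secret)
    return correct / trials
```

As ɛ grows the noise scale shrinks and the adversary's accuracy approaches 1; as ɛ → 0 it approaches the 50% coin-flip baseline, which is precisely the trade-off the hypothesis-testing interpretation quantifies.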
“…As for the methodology, we will use the framework of statistical hypothesis testing in a similar vein to [7], where the authors determine an appropriate value of the privacy parameter as a function of the false-alarm and mis-detection probabilities in deciding on the presence or absence of a particular record in a dataset. Similarly, in [8], the author studies differentially private hypothesis testing in the local setting, where users add the DP noise to their personal data locally before submitting it to the dataset.…”
Section: Related Work and Methodology
Mentioning confidence: 99%
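The style of guideline attributed to [7] can be sketched as follows: under the standard hypothesis-testing characterization of ɛ-DP, any adversary's false-alarm rate α and mis-detection rate β must satisfy α + e^ɛ·β ≥ 1 and e^ɛ·α + β ≥ 1, so an operating point (α, β) is unattainable whenever ɛ falls below the larger of the two implied bounds. The function name and the exact rule below are illustrative assumptions, not the paper's formula:

```python
import math

def epsilon_threshold(alpha, beta):
    """Largest privacy budget that still rules out an adversary achieving
    false-alarm rate <= alpha AND mis-detection rate <= beta.

    The epsilon-DP testing constraints
        alpha + e^eps * beta >= 1   and   e^eps * alpha + beta >= 1
    are violated by the point (alpha, beta) exactly when
        eps < max(log((1-alpha)/beta), log((1-beta)/alpha)).
    Illustrative sketch only, not the exact rule derived in [7].
    """
    assert 0 < alpha < 1 and 0 < beta < 1
    return max(math.log((1 - alpha) / beta), math.log((1 - beta) / alpha))
```

For example, forcing every adversary to incur either a false-alarm rate above 5% or a mis-detection rate above 5% requires ɛ < log(19) ≈ 2.94; any ɛ chosen strictly below the returned threshold rules out the targeted operating point.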
“…• We derive a trade-off between the privacy-protected adversary's advantage and the security of the system: the adversary seeks to remain undetected while inflicting as much damage as possible on the system, while the defender seeks to preserve the privacy of the system and detect the attacker. This trade-off is defined in the framework of statistical hypothesis testing, similarly to [7].…”
Section: Contributions
Mentioning confidence: 99%
“…Liu et al. [54] showed how ε influences the accuracy of differentially private hypothesis testing. They proposed a method to determine an appropriate value for ε that could be useful for choosing the ε value for our proposed algorithm; however, determining ε is outside the scope of our paper.…”
Section: Related Work
Mentioning confidence: 99%