2006
DOI: 10.1007/11681878_14
Calibrating Noise to Sensitivity in Private Data Analysis

Abstract: We continue a line of research initiated in [10, 11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the so-called true answer is the result of applying f to the database. To protect privacy, the true answer is perturbed by the addition of random noise generated according to a carefully chosen distribution, and this response, the true answer plus noise, is returned to the user. …
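The mechanism the abstract describes is the paper's Laplace mechanism: the noise is drawn from a Laplace (doubly-exponential) distribution whose scale is calibrated to the query's sensitivity and the privacy parameter ε. A minimal Python sketch of that calibration (the function and variable names are illustrative, not from the paper):

    import numpy as np

    def laplace_mechanism(true_answer, sensitivity, epsilon):
        # The paper's calibration: noise scale = sensitivity / epsilon,
        # so a more sensitive query or a stricter epsilon means more noise.
        scale = sensitivity / epsilon
        return true_answer + np.random.laplace(loc=0.0, scale=scale)

    # Example: a counting query changes by at most 1 when a single record
    # is added or removed, so its sensitivity is 1.
    noisy_count = laplace_mechanism(true_answer=1234, sensitivity=1.0, epsilon=0.1)

Returning the noisy value in place of the true answer is what yields the privacy guarantee, provided the stated sensitivity really bounds how much one record can change the answer.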

Cited by 5,380 publications (6,219 citation statements).
References 11 publications.
Citation statements, ordered by relevance:
“…The methodology of differential privacy [6,7] has provided a strong definition of privacy which in some settings, using a mechanism of doubly-exponential noise addition, also allows for extraction of informative statistics from databases. A recent paper by Barak et al [1] extends this approach to the release of a specified set of margins from a multi-way contingency table.…”
Section: Introduction (mentioning)
confidence: 99%
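For reference, the "strong definition of privacy" these excerpts cite is ε-differential privacy. A standard statement (paraphrased here, not quoted from the citing paper): a randomized mechanism $\mathcal{K}$ satisfies ε-differential privacy if, for all databases $D_1$, $D_2$ differing in at most one row and all sets of outputs $S$,

$$\Pr[\mathcal{K}(D_1) \in S] \le e^{\varepsilon} \cdot \Pr[\mathcal{K}(D_2) \in S].$$

The "doubly-exponential noise addition" mentioned above is the Laplace mechanism sketched after the abstract, which achieves this guarantee when the noise scale matches the query's sensitivity.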
“…Because differential privacy protects against a class of attackers, the security guarantee is formally stronger. For this reason, algorithms for differential privacy [9] would randomise scripts that are already t-private (but not ε-adversarially private), thus reducing their usability. In our fingerprinting context, we exploit the attacker's a priori knowledge for synthesising a channel that is t-private with minimal randomisation.…”
Section: Definition 5: A Channel C = (C, I, O) Is ε-Adversarially Private (mentioning)
confidence: 99%
“…The server's goal is to output a value which is close to f(x) but which reveals almost no information about any single individual. Recently, the latter notion has been made precise via the concept of differential privacy [9]. A standard way of obtaining such guarantees is to ask users to submit only Lipschitz functions, and have the server output f(x) plus some random noise depending on the desired privacy guarantee [9].…”
Section: Introduction (mentioning)
confidence: 99%
“…Recently, the latter notion has been made precise via the concept of differential privacy [9]. A standard way of obtaining such guarantees is to ask users to submit only Lipschitz functions, and have the server output f(x) plus some random noise depending on the desired privacy guarantee [9]. However, if a malicious user submits a function which is not Lipschitz, the differential privacy guarantee is lost.…”
Section: Introduction (mentioning)
confidence: 99%
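The Lipschitz condition in these excerpts is what makes the noise calibration sound: a query's sensitivity is its Lipschitz constant with respect to changing one record, and a function with no such bound admits no finite noise scale. A small illustrative check in Python (the helper sensitivity_over and the example queries are hypothetical; evaluating a handful of neighboring pairs only lower-bounds the true sensitivity, which is a supremum over all such pairs):

    def sensitivity_over(f, neighboring_pairs):
        # Maximum change of f across the supplied pairs of databases
        # differing in one record; a lower bound on the true sensitivity.
        return max(abs(f(d1) - f(d2)) for d1, d2 in neighboring_pairs)

    # Counting query: 1-Lipschitz, so Laplace noise of scale 1/epsilon suffices.
    print(sensitivity_over(len, [([1, 2, 3], [1, 2]), ([5], [])]))   # 1

    # Unbounded sum: a single record can move the answer arbitrarily far,
    # so no fixed noise scale can protect it.
    print(sensitivity_over(sum, [([10**6], [])]))                    # 1000000

This is exactly the failure mode the last excerpt points out: a malicious user submitting a non-Lipschitz query voids the differential privacy guarantee.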