2013
DOI: 10.1007/978-3-642-37119-6_25
Information-Theoretic Foundations of Differential Privacy

Cited by 22 publications (22 citation statements)
References 12 publications
“…Proof. This is an instance of Theorem 3 in Mir [24] (first appeared in Tishby et al [31]) by taking the distortion function to be the empirical risk. Note that this is a simple convex optimization over the functions and the proof involves substituting the solution into the optimality condition with a specific Lagrange multiplier chosen to appropriately adjust the normalization constant.…”
Section: Definition 8 (Conditional Entropy)
confidence: 91%
“…The variational solution of p(h|Z) that minimizes Ĩ(h; Z) + γ E_Z E_{h|Z} L(Z, h) is proportional to exp(−L(Z, h))π(h). This provides an alternative way of seeing the class of algorithms A that we consider. Ĩ(h; Z) can be thought of as an information-theoretic quantification of privacy loss, as described in [38,24]. As a result, we can think of the class of A that samples from MaxEnt distributions as the most private algorithm among all algorithms that achieve a given utility constraint.…”
Section: Definition 8 (Conditional Entropy)
confidence: 99%
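The variational solution quoted above, p(h|Z) ∝ exp(−L(Z, h))π(h), can be sketched numerically over a finite hypothesis set. The risk values, prior, and multiplier γ below are illustrative assumptions, not values from the cited papers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: four hypotheses with assumed empirical risks L(Z, h_i)
# on a fixed dataset Z, and a uniform prior pi(h).
L = np.array([0.9, 0.4, 0.1, 0.6])   # illustrative risk values
pi = np.full(4, 0.25)                # uniform prior pi(h)
gamma = 2.0                          # illustrative Lagrange multiplier

# Variational solution: p(h|Z) proportional to exp(-gamma * L(Z, h)) * pi(h)
w = np.exp(-gamma * L) * pi
p = w / w.sum()                      # normalize to a distribution

# Sampling a hypothesis from this MaxEnt/Gibbs posterior favors low risk
# while the temperature 1/gamma controls how much randomness (privacy) remains.
h = rng.choice(len(L), p=p)
```

Raising γ concentrates the posterior on the empirical-risk minimizer (more utility, less privacy); γ → 0 recovers the prior (maximal privacy under this measure).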
“…Firstly, the privacy-preserving mechanism can be modeled as a noisy channel [13], [17], [23], [24], including the radio channel model [25] connected with IoT devices. Then, the privacy leakage can be measured by entropy [19], [26], [27]. In particular, Alvim et al. proposed measuring information uncertainty based on the idea of quantitative information flow (QIF).…”
Section: Related Work
confidence: 99%
“…Intuitively, the more randomness the privacy mechanism injects, the better the privacy performance. The works [16], [18], [19] use MI to measure the privacy leakage of privacy-preserving mechanisms because MI has a clear meaning: it quantifies the reduction in uncertainty about the original information. However, the works [7], [20] adopt MI as a utility metric, aiming to preserve as much of the useful information as possible.…”
Section: Introduction
confidence: 99%
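The use of MI as a leakage metric described above can be made concrete with a toy channel. The randomized-response channel and helper function below are illustrative assumptions, not constructions from the cited works:

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits for input distribution p_x and row-stochastic p(y|x)."""
    p_xy = p_x[:, None] * channel                      # joint p(x, y)
    p_y = p_xy.sum(axis=0)                             # marginal p(y)
    ratio = p_xy / (p_x[:, None] * p_y[None, :])
    mask = p_xy > 0                                    # skip zero-probability cells
    return float((p_xy[mask] * np.log2(ratio[mask])).sum())

def rr_channel(q):
    """Randomized response: report the true bit w.p. q, flip it w.p. 1 - q."""
    return np.array([[q, 1 - q], [1 - q, q]])

p_x = np.array([0.5, 0.5])                             # uniform secret bit

# More randomness in the mechanism (q closer to 0.5) means less MI leakage.
leak_weak = mutual_information(p_x, rr_channel(0.9))   # little noise, high leakage
leak_strong = mutual_information(p_x, rr_channel(0.6)) # much noise, low leakage
```

At q = 0.5 the output is independent of the input and the leakage is exactly zero, matching the intuition in the excerpt that maximal randomness gives maximal privacy.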
“…Although several information-theoretic interpretations of differential privacy have been presented [16][17][18], the available literature offers neither an operational meaning for the concept nor a systematic approach for setting the differential privacy parameter (beyond a broad sweep). Further, in practice, it may not be possible to use additive noise with infinite support, as the noise might need to satisfy certain constraints, e.g., it must belong to a bounded set for smart metering [19].…”
Section: Introduction
confidence: 99%
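The bounded-support constraint noted in the excerpt can be illustrated against the standard Laplace mechanism. The clipping step below is only a sketch of the constraint: naively truncating the noise changes the distribution and does not by itself preserve the ε-DP guarantee, which is exactly the gap the excerpt points at. The interval bounds and parameters are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def laplace_mechanism(true_value, sensitivity, epsilon):
    """Standard eps-DP Laplace mechanism; its noise has infinite support."""
    return true_value + rng.laplace(scale=sensitivity / epsilon)

def clipped_release(true_value, sensitivity, epsilon, lo, hi):
    """Force the release into [lo, hi] (e.g., a valid smart-meter reading).

    NOTE: clipping is shown only to illustrate the bounded-support
    constraint; it is not a DP-preserving construction.
    """
    return float(np.clip(laplace_mechanism(true_value, sensitivity, epsilon), lo, hi))

# Hypothetical smart-metering style release bounded to a valid reading range.
out = clipped_release(3.2, sensitivity=1.0, epsilon=0.5, lo=0.0, hi=10.0)
```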