2016 Information Theory and Applications Workshop (ITA)
DOI: 10.1109/ita.2016.7888175

On privacy-utility tradeoffs for constrained data release mechanisms

Cited by 58 publications (35 citation statements)
References 11 publications
“…The highest privacy standard that a data disclosure strategy can guarantee, called perfect privacy, corresponds to when nothing can be learned about an individual that could not have been learned without the disclosed data anyway [4]. While studied in [5], [6], perfect privacy is often disregarded for being too restrictive, corresponding to an extreme choice within the trade-off that exists between privacy and utility [7], [8]. The most popular approach that takes advantage of this trade-off is differential privacy [9], which is equipped with free parameters that can be flexibly tuned in order to adapt to the requirements of diverse scenarios.…”
Section: arXiv:1904.01711v1 [cs.IT] 2 Apr 2019 (mentioning, confidence: 99%)
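For orientation, the tunable parameter in differential privacy is the leakage budget epsilon, and perfect privacy is recovered only in the limit epsilon -> 0. A minimal, hypothetical Python sketch of the standard Laplace mechanism (function names and values are illustrative, not taken from the cited works) shows how the parameter trades privacy against utility:

import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    # Release a noisy version of true_value satisfying epsilon-differential
    # privacy for a query with the given L1 sensitivity. The noise scale
    # grows as epsilon shrinks, so smaller epsilon means stronger privacy
    # and lower utility (illustrative sketch only).
    rng = np.random.default_rng() if rng is None else rng
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: a counting query (sensitivity 1) released at two privacy levels.
rng = np.random.default_rng(0)
print(laplace_mechanism(42.0, sensitivity=1.0, epsilon=1.0, rng=rng))   # mild noise
print(laplace_mechanism(42.0, sensitivity=1.0, epsilon=0.01, rng=rng))  # heavy noise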
“…Another problem related to the one considered here is the privacy funnel, in which the goal is to reveal the data set X within a given accuracy under some utility measure, while keeping the latent variable W as private as possible [18]. Also, various metrics for quantifying the quality of the disclosure strategy have been studied in [6], [8], [19], [20].…”
Section: B. Scenario and Related Work (mentioning, confidence: 99%)
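As a rough illustration of the funnel's two competing quantities, the following hypothetical sketch (finite alphabets; all names are assumptions, not from [18]) scores a single candidate disclosure channel p(z|x) under the Markov chain W - X - Z; the funnel itself would minimize I(W;Z) over such channels subject to a floor on I(X;Z):

import numpy as np

def mutual_information(p_ab):
    # I(A;B) in nats, computed from a joint probability matrix p_ab[a, b].
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    denom = p_a * p_b  # product-of-marginals matrix, same shape as p_ab
    mask = p_ab > 0
    return float(np.sum(p_ab[mask] * np.log(p_ab[mask] / denom[mask])))

def funnel_point(p_wx, p_z_given_x):
    # Score one candidate channel: leakage I(W;Z) about the latent variable
    # versus utility I(X;Z) about the data set. Under W - X - Z we have
    # p(w,z) = sum_x p(w,x) p(z|x) and p(x,z) = p(x) p(z|x).
    p_wz = p_wx @ p_z_given_x
    p_x = p_wx.sum(axis=0)
    p_xz = p_x[:, None] * p_z_given_x
    return mutual_information(p_wz), mutual_information(p_xz)

# Releasing X unchanged (identity channel) maximizes both quantities.
p_wx = np.array([[0.4, 0.1], [0.1, 0.4]])
print(funnel_point(p_wx, np.eye(2)))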
“…The authors in [10] used the notion of self-information cost to design optimal randomized privacy filters for improving the privacy of a (private) random variable correlated with a public random variable. The interested reader is referred to [11], [12], [13] and the references therein for a detailed investigation of information-theoretic approaches to the data privacy problem.…”
Section: Related Work (mentioning, confidence: 99%)
“…A wide variety of methods to statistically quantify and address privacy have been proposed, such as k-anonymity [30], L-diversity [18], t-closeness [16], and differential privacy [5]. In our work, we focus on an information-theoretic approach where privacy is quantified by the mutual information between the data release and the sensitive information [35], [27], [3], [28], [2].…”
Section: Introduction (mentioning, confidence: 99%)
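In this notation, the leakage of a release Z about the sensitive information X is I(X;Z) = sum over x,z of p(x,z) log( p(x,z) / (p(x)p(z)) ), which is zero exactly when Z is independent of X (perfect privacy) and grows as Z becomes more informative about X.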
“…We build upon the non-asymptotic, information-theoretic framework introduced in [27], [3], where the sensitive and useful data are respectively modeled as random variables X and Y. We also adopt the extension considered in [2], where only a (potentially partial and/or noisy) observation W of the data is available. In this framework, the design of the privacy-preserving mechanism to release Z is formulated as the optimization of the tradeoff between minimizing privacy leakage, quantified by the mutual information I(X;Z), and minimizing an expected distortion E[d(Y,Z)].…”
Section: Introduction (mentioning, confidence: 99%)
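To make the two terms of this tradeoff concrete, here is a minimal, hypothetical sketch (finite alphabets; function and variable names are assumptions, not the cited papers' implementation) that evaluates a single candidate release channel p(z|w); mechanism design would then minimize over such channels, for instance via the Lagrangian I(X;Z) + lambda * E[d(Y,Z)]:

import numpy as np

def mutual_information(p_ab):
    # I(A;B) in nats from a joint probability matrix p_ab[a, b]
    # (same helper as in the earlier sketch, repeated for self-containment).
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)
    denom = p_a * p_b
    mask = p_ab > 0
    return float(np.sum(p_ab[mask] * np.log(p_ab[mask] / denom[mask])))

def tradeoff_terms(p_xw, p_yw, p_z_given_w, d):
    # Leakage I(X;Z) and expected distortion E[d(Y,Z)] for one candidate
    # channel p(z|w), assuming the Markov chain (X, Y) - W - Z: the release
    # Z is generated from the observation W alone, so
    # p(x,z) = sum_w p(x,w) p(z|w), and likewise for p(y,z).
    p_xz = p_xw @ p_z_given_w
    p_yz = p_yw @ p_z_given_w
    leakage = mutual_information(p_xz)
    distortion = float(np.sum(p_yz * d))  # d[y, z]: cost of releasing z when the useful data is y
    return leakage, distortion

# Toy check: W = X = Y on a binary alphabet; releasing W verbatim gives
# maximal leakage (ln 2 nats) and zero Hamming distortion.
p_xw = np.array([[0.5, 0.0], [0.0, 0.5]])
leak, dist = tradeoff_terms(p_xw, p_xw, np.eye(2), d=1.0 - np.eye(2))
print(leak, dist)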