2015
DOI: 10.1145/2699501

Optimizing Batch Linear Queries under Exact and Approximate Differential Privacy

Abstract: Differential privacy is a promising privacy-preserving paradigm for statistical query processing over sensitive data. It works by injecting random noise into each query result such that it is provably hard for the adversary to infer the presence or absence of any individual record from the published noisy results. The main objective in differentially private query processing is to maximize the accuracy of the query results while satisfying the privacy guarantees. Previous work, notably Li et al. [2010], has su…
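As a concrete illustration of the noise-injection idea described in the abstract, here is a minimal Python sketch (not the paper's optimized algorithm) that answers a batch of linear counting queries with the classic Laplace mechanism. The workload W, data vector x, and epsilon below are illustrative placeholders.

```python
import numpy as np

def laplace_batch(W, x, epsilon):
    """Answer the query batch W @ x under epsilon-differential privacy
    with Laplace noise calibrated to the L1 sensitivity of the workload
    (the maximum column L1 norm of W, i.e., the largest change in the
    answers when one record is added or removed)."""
    sensitivity = np.abs(W).sum(axis=0).max()
    scale = sensitivity / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=W.shape[0])
    return W @ x + noise

# Hypothetical example: 3 range queries over a 4-bin histogram.
W = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [1, 1, 1, 1]], dtype=float)
x = np.array([10, 20, 30, 40], dtype=float)  # true bin counts
print(laplace_batch(W, x, epsilon=1.0))
```

The noise scale grows with the workload's sensitivity, which is exactly the quantity that the strategy-based methods discussed in the citation statements below try to reduce.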

Cited by 26 publications (21 citation statements). References 60 publications.

“…This section experimentally evaluates the effectiveness of the proposed convex optimization algorithm COA for linear aggregate processing under approximate differential privacy. We compare COA with six existing methods: Gaussian Mechanism (GM) [16], Wavelet Mechanism (WM) [29], Hierarchical Mechanism (HM) [8], Exponential Smoothing Mechanism (ESM) [30,13], Adaptive Mechanism (AM) [14,13], and Low-Rank Mechanism (LRM) [30,31]. Qardaji et al. [23] proposed an improved version of HM by carefully selecting the branching factor.…”
Section: Methods
confidence: 99%
“…As shown in [13,14,30,31,11], different strategy queries lead to different overall accuracy for the original workload. Hence, an important problem in batch processing under differential privacy is to find a suitable strategy that leads to the highest accuracy.…”
Section: Introduction
confidence: 99%
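The strategy idea quoted above can be sketched in a few lines: answer a strategy matrix A under Laplace noise, then reconstruct the workload answers W x by least squares. This is a hedged illustration in the spirit of the matrix mechanism and LRM cited in [13,14,30,31], not their exact algorithms; W, A, x, and epsilon are hypothetical.

```python
import numpy as np

def strategy_answer(W, A, x, epsilon):
    """Answer strategy queries A @ x noisily, then recover the
    workload answers W @ x via a least-squares estimate of x."""
    sens_A = np.abs(A).sum(axis=0).max()            # L1 sensitivity of A
    y = A @ x + np.random.laplace(0.0, sens_A / epsilon, size=A.shape[0])
    x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)   # noisy estimate of x
    return W @ x_hat                                # reconstructed answers

W = np.array([[1, 1, 0, 0],
              [0, 1, 1, 0],
              [1, 1, 1, 1]], dtype=float)
x = np.array([10, 20, 30, 40], dtype=float)
print(strategy_answer(W, np.eye(4), x, epsilon=1.0))  # strategy: identity
print(strategy_answer(W, W, x, epsilon=1.0))          # strategy: W itself
```

Running it with A = I versus A = W typically yields visibly different errors on the same workload, which is precisely why strategy selection matters.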
“…These methods use past/auxiliary information to improve the utility of the query answers. Examples are the matrix mechanism [58,59], the multiplicative weights mechanism [37,41], the low-rank mechanism [99,100], correlated noise [75,93], least-square estimation [75], boosting [26], and the sparse vector technique [25,64]. We also note that some of these adaptive methods can be used in the restricted case of a matrix-valued query, where the matrix-valued query can be decomposed into multiple linear vector-valued queries [41,42,74,75,98,100].…”
Section: Adaptive
confidence: 99%
“…where the variance of $\mathrm{Lap}(\lambda)$ is $2\lambda^2 = 2\Delta Q^2/\varepsilon^2$. Since the Laplace noise injected into each of the $m$ query results is independent, the overall expected squared error of the query answers obtained by the Laplace mechanism is $2m\Delta Q^2/\varepsilon^2$ [16]. Remark 3.3: Note that the amount of error depends only on the sensitivity of the queries, regardless of the records in dataset $D$. Therefore, we can obtain the desired noise signal to ensure differential privacy (denoted as $P_{\mathrm{DPPV}}(k)$) by sampling through $f(x)$ (defined in (5) and (4)), where $k$ represents the time step.…”
Section: A. Differential Privacy Noise Calculation
confidence: 99%
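The quoted variance claim is easy to verify numerically: with $\lambda = \Delta Q/\varepsilon$, each Laplace draw has variance $2\lambda^2$, so $m$ independent draws accumulate an expected squared error of $2m\Delta Q^2/\varepsilon^2$. Below is a quick Monte Carlo check; the values of dQ, eps, and m are arbitrary test assumptions, not taken from the paper.

```python
import numpy as np

# With lambda = dQ/eps, Var(Lap(lambda)) = 2*lambda**2, so m independent
# draws give expected total squared error 2*m*dQ**2/eps**2.
dQ, eps, m = 2.0, 0.5, 10
lam = dQ / eps
samples = np.random.laplace(0.0, lam, size=(200_000, m))
empirical = (samples ** 2).sum(axis=1).mean()   # E[ sum of squared noise ]
theoretical = 2 * m * dQ**2 / eps**2
print(empirical, theoretical)  # the two values should be close
```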