2021
DOI: 10.48550/arxiv.2106.10068
Preprint

Differentially Private Sparse Vectors with Low Error, Optimal Space, and Fast Access

Abstract: Representing a sparse histogram, or more generally a sparse vector, is a fundamental task in differential privacy. An ideal solution would use space close to information-theoretical lower bounds, have an error distribution that depends optimally on the desired privacy level, and allow fast random access to entries in the vector. However, existing approaches have only achieved two of these three goals. In this paper we introduce the Approximate Laplace Projection (ALP) mechanism for approximating k-sparse vecto…
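To make the trade-off described in the abstract concrete, the sketch below (illustrative only, not the ALP mechanism from the paper) shows the standard dense Laplace baseline: under ε-differential privacy with neighbors at L₁ distance 1, adding Laplace(1/ε) noise to every coordinate gives near-optimal per-entry error and O(1) access, but stores all d entries, which is exactly the space cost ALP is designed to avoid for k-sparse vectors. Function and variable names are assumptions for illustration.

```python
import numpy as np

def dense_laplace_release(v: np.ndarray, epsilon: float) -> np.ndarray:
    """epsilon-DP baseline for vectors whose neighbors differ by at most 1 in L1 norm.

    Adds independent Laplace(1/epsilon) noise to every coordinate: per-entry error
    is O(1/epsilon) in expectation and lookups are O(1), but the output is dense,
    so it takes Theta(d) space even when v has only k << d non-zero entries.
    """
    scale = 1.0 / epsilon  # L1 sensitivity is 1 under this neighboring relation
    return v + np.random.laplace(loc=0.0, scale=scale, size=v.shape)

# Example: a sparse histogram over d = 1,000,000 buckets with 10 non-zero counts.
rng = np.random.default_rng(0)
d, k, eps = 1_000_000, 10, 1.0
v = np.zeros(d)
v[rng.choice(d, size=k, replace=False)] = rng.integers(1, 100, size=k)
noisy = dense_laplace_release(v, eps)  # accurate and fast to query, but Theta(d) space
```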

Cited by 2 publications (5 citation statements)
References 9 publications
“…(3) We extend our results to the continual release setting. (4) We empirically validate our results by evaluating DPExpGK, analyzing and comparing various aspects of performance on real-world and synthetic datasets.…”
Section: Our Contributions (mentioning)
confidence: 81%
“…Differentially private statistical estimation: Estimation of global data statistics (or more generally, inference) is in general an important use-case of differential privacy [34]. The related private histogram release problem was studied by [4,5,10,38] and in the continual observation privacy setting [11,12,21]. Perrier et al [47] make improvements over prior work for private release of some statistics such as the moving average of a stream when the data distribution is light-tailed.…”
Section: Differential Privacy (mentioning)
confidence: 99%
“…Sparse vector release under central DP. Previous works [5,17,30] discussed a related setting in which a single entity wishes to release a k-sparse vector v ∈ [0, u]^d (u can be large) under differential privacy. The neighboring notion is also defined by L₁ distance: a neighboring input pair v ∼ v′ satisfies…”
Section: Additional Related Work (mentioning)
confidence: 99%
“…‖v − v′‖₁ ≤ 1. For example, the newest work on this line, the ALP mechanism [5], showed how to privately encode the k-sparse vector with O(k log(d + u)) bits with L∞ decoding error of O(log(d)/ε). However, the encoding-decoding processes of these works are biased.…”
Section: Additional Related Work (mentioning)
confidence: 99%
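For intuition on the O(k log(d + u)) space bound quoted above: even a non-private, exact encoding of a k-sparse vector v ∈ [0, u]^d needs about k(log₂ d + log₂ u) bits, one index and one value per non-zero entry, which is the information-theoretic baseline the ALP mechanism [5] matches up to constant factors. A small sketch of that arithmetic (the function name is illustrative, not taken from the cited works):

```python
import math

def sparse_encoding_bits(d: int, u: int, k: int) -> int:
    """Bits to store k (index, value) pairs exactly: ceil(log2 d) bits per index
    plus ceil(log2(u + 1)) bits per value, i.e. O(k log(d + u)) in total."""
    return k * (math.ceil(math.log2(d)) + math.ceil(math.log2(u + 1)))

# d = 2**20 coordinates, values up to u = 2**30 - 1, k = 100 non-zero entries:
print(sparse_encoding_bits(d=2**20, u=2**30 - 1, k=100))  # 5000 bits
# versus 2**20 * 30 = 31,457,280 bits for a dense 30-bit-per-entry encoding
```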