2020
DOI: 10.1214/19-aos1901
Geometrizing rates of convergence under local differential privacy constraints

Cited by 32 publications (38 citation statements) · References 17 publications
“…To address this shortcoming, the local differential privacy constraint [see, for example, 21, 12, and the references therein] was introduced to provide a setting where analysis must be carried out in such a way that each raw data point is only ever seen by the original data holder. The simplest example of a locally differentially private mechanism is the randomised response [35] used with binary data, but mechanisms have also been developed for tasks such as classification [3], generalised linear modelling [12], empirical risk minimisation [33], density estimation [5], functional estimation [27] and goodness-of-fit testing [4].…”
Section: Introduction
confidence: 99%
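The randomised response mechanism mentioned in the excerpt above can be sketched as follows. This is a minimal illustration for a single binary attribute, not the construction from any of the cited papers; the function names and the debiasing helper are ours, and `epsilon` denotes the local privacy parameter:

```python
import math
import random

def randomised_response(bit: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (1 + e^eps),
    otherwise flip it; this satisfies epsilon-local differential privacy,
    since the likelihood ratio of any output under the two inputs
    is at most e^eps."""
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return bit if random.random() < p_truth else not bit

def debiased_proportion(reports, epsilon: float) -> float:
    """Unbiased estimate of the true proportion of ones,
    correcting for the known flipping probability."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)
```

Note that only the randomised reports ever leave the data holders; the analyst sees no raw bit, which is exactly the local-model constraint the excerpt describes.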
“…Evidently, the privacy condition (1) becomes more restrictive for smaller values of the two parameters α and β. Although Definition 2.1 smoothly bridges the cases β = 0 and β > 0, the classical anonymization techniques used for β = 0 and β > 0 are essentially different: In the case β = 0, Laplace perturbation as well as randomization techniques as considered in [11,21] can be used. In the case β > 0, adding appropriately scaled Gaussian noise has been suggested in [17].…”
Section: Definition Of Approximate Differential Privacy
confidence: 99%
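The two noise-addition routes contrasted in the excerpt above can be sketched as follows. This is an illustrative sketch only: the exact calibrations in the cited works differ, and the Gaussian scale below uses the standard sqrt(2 ln(1.25/β)) calibration, which is the textbook choice for α < 1 rather than anything taken from the paper:

```python
import math
import random

def laplace_mechanism(value: float, sensitivity: float, alpha: float) -> float:
    """Pure alpha-DP (the beta = 0 case): add Laplace noise with scale
    sensitivity/alpha.  A Laplace draw is generated as the difference
    of two i.i.d. exponential draws with that mean."""
    scale = sensitivity / alpha
    noise = random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)
    return value + noise

def gaussian_mechanism(value: float, sensitivity: float,
                       alpha: float, beta: float) -> float:
    """Approximate (alpha, beta)-DP with beta > 0: add Gaussian noise with
    sigma = sensitivity * sqrt(2 ln(1.25/beta)) / alpha (standard calibration)."""
    sigma = sensitivity * math.sqrt(2.0 * math.log(1.25 / beta)) / alpha
    return random.gauss(value, sigma)
```

The qualitative point of the excerpt is visible here: the two regimes use essentially different noise distributions, and only the β > 0 route admits the Gaussian calibration.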
“…At least, using the approach suggested in Subsection 2.3 (with its specializations considered in Propositions 3.3 and 3.4) we relieved ourselves from the drawback of the Laplace method that one can privatize only one functional of the form f(t) for one single t that has to be fixed even before the anonymization. Note that this drawback is, for instance, also present in the mechanisms suggested in [21]. From this point of view, (α, β)-differential privacy with strictly positive β via one of these approaches should be preferred.…”
Section: Adaptation To Unknown Smoothness
confidence: 99%
“…Although this originates in cryptography, there is a growing statistical literature that aims to explore the constraints of this framework and provide procedures that make optimal use of available data (e.g. Wasserman & Zhou 2010, Duchi et al. 2018, Rohde & Steinberger 2020, Cai et al. 2021). Work in this area is split between central models of privacy, where there is a third party trusted to collect and analyse data before releasing privatised results, and local models of privacy, where data are randomised before collection.…”
Section: Introduction
confidence: 99%
“…Duchi et al. 2018), nonparametric estimation problems (e.g. Rohde & Steinberger 2020), and change point analysis (e.g. Berrett & Yu 2021), to name but a few.…”
Section: Introduction
confidence: 99%