2021
DOI: 10.48550/arxiv.2102.01258
Preprint

Local Differential Privacy Is Equivalent to Contraction of $E_γ$-Divergence

Cited by 2 publications (6 citation statements). References 0 publications.

“…We can also bound the hockey-stick divergence for any γ ≥ 1 directly by the trace-distance, leading to an alternative way of bounding the contraction coefficients. This generalizes the analogous classical result in [3].…”
Section: The Quantum Hockey-stick Divergence (supporting, confidence: 85%)
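
For orientation, the classical object being generalized in this citation is the hockey-stick ($E_\gamma$) divergence. The following is a sketch using the standard definitions, not text from the cited quantum paper; the bound by total variation is presumably the classical analogue of the trace-distance bound the quote refers to. For distributions $P$, $Q$ on a common alphabet and $\gamma \ge 1$,
$$
E_\gamma(P \,\|\, Q) \;=\; \sup_{A} \bigl( P(A) - \gamma\, Q(A) \bigr) \;=\; \sum_{y} \bigl[ P(y) - \gamma\, Q(y) \bigr]_+ .
$$
Since $\gamma \mapsto E_\gamma(P\|Q)$ is non-increasing and $E_1(P\|Q) = \mathrm{TV}(P,Q)$, one gets
$$
E_\gamma(P \,\|\, Q) \;\le\; \mathrm{TV}(P,Q) \qquad \text{for all } \gamma \ge 1 .
$$
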
“…In fact, from the properties of $E_\gamma$ one can see that this condition is restrictive enough to imply a bound on the trace-distance contraction coefficient. This generalizes a classical result in [3]. Now the corollary follows because from Lemma 2.4 we get…”
Section: Local Quantum Differential Privacy (supporting, confidence: 80%)
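
To make the connection to local differential privacy concrete: in the classical, discrete setting, the $\varepsilon$-LDP condition on a mechanism $K$ is exactly the vanishing of $E_{e^\varepsilon}$ between any two output distributions. This standard reformulation is sketched here for context, not quoted from the paper:
$$
K \text{ is } \varepsilon\text{-LDP}
\;\iff\; K(y \mid x) \le e^{\varepsilon} K(y \mid x') \;\; \forall\, x, x', y
\;\iff\; E_{e^{\varepsilon}}\bigl( K(\cdot \mid x) \,\big\|\, K(\cdot \mid x') \bigr) = 0 \;\; \forall\, x, x'.
$$
The title result then, roughly, characterizes $\varepsilon$-LDP through how $K$ contracts $E_{e^\varepsilon}$; the quoted passage lifts this style of argument to the trace-distance contraction coefficient in the quantum case.
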
“…We give a more formal definition below, but we observe here that the notion of f-divergence generalizes common divergences including total variation, KL-divergence, Rényi divergences, and the $E_\gamma$ divergence [Polyanskiy and Wu, 2022+, Van Erven and Harremoës, 2014, Asoodeh et al., 2021]. We will make the assumption that for some convex f, the source and target measures satisfy $D_f(\nu \| \mu) < \infty$ and ask what the sample complexity of ε-approximate rejection sampling is under this constraint.…”
Section: Introduction (mentioning, confidence: 99%)
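
As a small illustration of the f-divergence family mentioned in the last citation, the sketch below computes the hockey-stick divergence $E_\gamma$ (the f-divergence with $f(t) = (t-\gamma)_+$) for two discrete distributions and checks that $E_1$ coincides with total variation. The distributions are made up for illustration and do not come from any of the cited papers.

```python
import numpy as np

def hockey_stick(p, q, gamma):
    """E_gamma(P||Q) = sum_y max(P(y) - gamma*Q(y), 0) for discrete P, Q."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return np.maximum(p - gamma * q, 0.0).sum()

# Illustrative (made-up) output distributions of a randomized mechanism.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.4, 0.35, 0.25])

# E_gamma decreases in gamma and is bounded by total variation (gamma = 1).
for g in [1.0, 1.5, 2.0]:
    print(f"E_{g}(P||Q) = {hockey_stick(p, q, g):.4f}")
print(f"TV(P,Q)   = {0.5 * np.abs(p - q).sum():.4f}")  # equals E_1(P||Q)
```
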