2021
DOI: 10.29012/jpc.781
Interaction is Necessary for Distributed Learning with Privacy or Communication Constraints

Abstract: Local differential privacy (LDP) is a model in which users send privatized data to an untrusted central server whose goal is to solve some data analysis task. In the non-interactive version of this model, the protocol consists of a single round in which the server sends requests to all users and then receives their responses. This version is deployed in industry due to its practical advantages and has attracted significant research interest. Our main result is an exponential lower bound on the number of samples necessary…
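
As intuition for the non-interactive model described in the abstract, here is a minimal sketch of a classic one-round local randomizer (binary randomized response) together with the server-side debiasing step. The function names, the ε value, and the usage example are illustrative assumptions, not the construction analyzed in the paper.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Local randomizer, run once per user (single round, no interaction).

    Reports the true bit with probability e^eps / (e^eps + 1) and the
    flipped bit otherwise; this satisfies epsilon-LDP.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def estimate_mean(reports: list, epsilon: float) -> float:
    """Server-side debiasing: E[report] = mu*(2p - 1) + (1 - p), solve for mu."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return (sum(reports) / len(reports) - (1.0 - p)) / (2.0 * p - 1.0)

# Hypothetical usage: 10,000 users, each holding one private bit.
users = [random.random() < 0.3 for _ in range(10_000)]
reports = [randomized_response(int(b), epsilon=1.0) for b in users]
print(estimate_mean(reports, epsilon=1.0))  # close to 0.3 in expectation
```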

Cited by 6 publications (18 citation statements). References 30 publications.
“…Shamir [41] studies various estimation tasks under a range of information constraints. Finally, Dagan and Feldman [42] establish a separation between interactive and noninteractive learning for large-margin classifiers, under both local privacy and communication constraints.…”
Section: Prior Work. Citation type: mentioning.
confidence: 99%
“…The impact of private training on optimization Many recent works have studied the impact of differentially private mechanisms on centralized optimization [26,27,28,61,81,90,231]. Such works are often concerned with developing better differential privacy mechanisms for model training, and for getting tight bounds on the (ε, δ) privacy guarantees of such mechanisms.…”
Section: Data Anonymization. Citation type: mentioning.
confidence: 99%
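
For context on the clip-and-noise pattern that such private-training works build on, a minimal DP-SGD-style update is sketched below. The function and parameter names are hypothetical, and the sketch omits the privacy accounting that yields the (ε, δ) guarantees mentioned above.

```python
import numpy as np

def noisy_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                   noise_multiplier=1.1, rng=None):
    """One Gaussian-mechanism gradient step (DP-SGD style): clip each
    per-example gradient, average, then add noise scaled to the clip bound."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(per_example_grads)
    clipped = [g * min(1.0, clip_norm / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / n, size=avg.shape)
    return params - lr * (avg + noise)
```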
“…[40] studies various estimation tasks under a range of information constraints. Finally, [19] establish a separation between interactive and noninteractive learning for large-margin classifiers, under both local privacy and communication constraints.…”
Section: Interactive Testing and Estimation of Discrete Distributions. Citation type: mentioning.
confidence: 99%
“…The crux of the previous bound is Eq. (19), which relates the Kullback-Leibler divergence to a per-coordinate information quantity…”
Section: Remark 18 (Is the Bound Above Tight?). Citation type: mentioning.
confidence: 99%