2020
DOI: 10.48550/arxiv.2007.10976
Preprint

Interactive Inference under Information Constraints

Abstract: We consider distributed inference using sequentially interactive protocols. We obtain lower bounds on the minimax sample complexity of interactive protocols under local information constraints, a broad family of resource constraints which captures communication constraints, local differential privacy, and noisy binary queries as special cases. We focus on the inference tasks of learning (density estimation) and identity testing (goodness-of-fit) for discrete distributions under total variation distance, and es…

Cited by 4 publications (23 citation statements)
References 22 publications

“…1) AHEAD is robust to the changes in data distributions. Recalling the derivation … 6 groups, while HDG separates users into C_3^2 + C_3^1 groups, where the number of user records in each group of AHEAD is half of that of HDG, i.e., doubling the noise error of AHEAD. Therefore, from Figure 8(a) and 8(b), we know that the superiority of AHEAD will decrease with fewer user records.…”
Section: MSE Under Two Expansion Methods
Confidence: 99%
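The quoted statement rests on a simple scaling argument: if each AHEAD group holds half as many user records as an HDG group, the variance of each locally private frequency estimate roughly doubles. A minimal arithmetic sketch of that claim, assuming the standard 1/n variance scaling of LDP frequency oracles; the total user count below is hypothetical and not taken from either cited paper.

```python
from math import comb

# Illustrative numbers only: the 1/n variance scaling is the standard
# behaviour of LDP frequency oracles; n_total is hypothetical.
n_total = 100_000
hdg_groups = comb(3, 2) + comb(3, 1)       # C_3^2 + C_3^1 = 6 groups for HDG
n_per_hdg_group = n_total / hdg_groups
n_per_ahead_group = n_per_hdg_group / 2    # quote: AHEAD groups hold half as many records

# Relative noise variance per group is proportional to 1/n.
print((1 / n_per_ahead_group) / (1 / n_per_hdg_group))   # 2.0 -> noise error doubles
```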
“…When answering a range query, HIO completely covers the query range by using the minimum number of intervals from different layers. For example, when the users' private attribute domain size |D| = 8 and tree fanout B = 2, the range query [2,7] can be decomposed into intervals [2,3] ∪ [4,7]. Then, HIO adds the estimated frequency values of the two intervals above to get the answer of a range query.…”
Section: Hierarchical-Interval Optimized (HIO)
Confidence: 99%
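The decomposition described in the quote can be reproduced with a short recursive sketch. This assumes a complete B-ary hierarchy over the domain [0, |D| − 1] with |D| a power of B; the function and variable names are illustrative and not drawn from the HIO implementation.

```python
def decompose(lo, hi, node_lo, node_hi, fanout):
    """Return the minimal set of hierarchy intervals covering [lo, hi]."""
    # Node interval fully inside the query: take it whole, stop descending.
    if lo <= node_lo and node_hi <= hi:
        return [(node_lo, node_hi)]
    # Node interval disjoint from the query: contributes nothing.
    if node_hi < lo or hi < node_lo:
        return []
    # Otherwise split the node into `fanout` equal children and recurse
    # (assumes the domain size is a power of the fanout, so sizes divide evenly).
    size = (node_hi - node_lo + 1) // fanout
    parts = []
    for i in range(fanout):
        child_lo = node_lo + i * size
        parts += decompose(lo, hi, child_lo, child_lo + size - 1, fanout)
    return parts

# Example from the quoted text: |D| = 8, B = 2, query [2, 7].
print(decompose(2, 7, 0, 7, 2))   # [(2, 3), (4, 7)]
```

As in the quoted example, the two returned intervals are [2,3] and [4,7], and their estimated frequencies are summed to answer the range query.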