2017
DOI: 10.1109/tit.2017.2664813
Information-Theoretic Lower Bounds for Distributed Function Computation

Abstract: We derive information-theoretic converses (i.e., lower bounds) for the minimum time required by any algorithm for distributed function computation over a network of point-to-point channels with finite capacity, where each node of the network initially has a random observation and aims to compute a common function of all observations to a given accuracy with a given confidence by exchanging messages with its neighbors. We obtain the lower bounds on computation time by examining the conditional mutual information…
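As a rough illustration only (a schematic in my own notation, not the paper's exact theorem): cutset converses of this kind typically lower-bound the computation time $T$ by the conditional information that must cross a cut, divided by the total channel capacity crossing it,

\[
T \;\ge\; \max_{S} \, \frac{I\big(Z;\, \hat{Z}_v \,\big|\, X_S\big)}{C_S},
\]

where $Z = f(X_1, \dots, X_n)$ is the function value, $\hat{Z}_v$ is the estimate at a node $v$ separated from the rest of the network by the cut $S$, $X_S$ is the set of observations on $v$'s side of the cut, and $C_S$ is the sum of the capacities of the channels crossing the cut. All of these symbols are assumptions introduced here for illustration.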


Cited by 22 publications (27 citation statements: 0 supporting, 27 mentioning, 0 contrasting)
References 39 publications
“…In accordance with an information-theoretic framework, Xu and Raginsky [18,19] study the fundamental time-step limits of distributed function computation in a constrained probabilistic setting. The lower- and upper-bound results are based on tradeoffs between: (1) the minimal amount of information necessarily extracted about the function value by any accuracy- and confidence-constrained algorithm, and (2) the maximal amount of information about the function value obtained by any algorithm within specified time-step and communication bounds.…”
Section: Results
mentioning confidence: 99%
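To make the quoted tradeoff concrete, here is a schematic of my own (assuming both quantities measure information about the function value in bits): if every algorithm meeting the accuracy constraint $\epsilon$ and confidence constraint $\delta$ must extract at least $I_{\min}(\epsilon, \delta)$ bits, while no algorithm can acquire more than $R(T)$ bits within $T$ time steps (with $R$ nondecreasing in $T$), then the optimal computation time satisfies

\[
T^*(\epsilon, \delta) \;\ge\; \inf\{\, T : R(T) \ge I_{\min}(\epsilon, \delta) \,\}.
\]

Here $I_{\min}$ and $R$ are hypothetical placeholders for items (1) and (2) in the excerpt, not quantities defined in the cited papers.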
“…For instance, midrange [49] and median-of-means [54] estimators have been considered as alternatives to linear estimators under different scenarios to improve performance. Extension of this setup would be of interest for different network topologies, as opposed to the centralized fusion setting of this paper, as in [55]. and then compute the gradient with respect to $w$ to get $\Sigma_{U_\theta} w - \lambda \mathbf{1} = 0$, which is satisfied iff $w = \lambda \Sigma_{U_\theta}^{-1} \mathbf{1}$.…”
Section: Discussion
mentioning confidence: 99%
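The quoted gradient step is the stationarity condition of a constrained quadratic minimization; the following sketch (my reconstruction, assuming the weight vector $w$ is constrained to sum to one) fills in the surrounding derivation:

\[
\min_{w} \; \tfrac{1}{2} w^\top \Sigma_{U_\theta} w
\quad \text{s.t.} \quad \mathbf{1}^\top w = 1,
\qquad
\mathcal{L}(w, \lambda) = \tfrac{1}{2} w^\top \Sigma_{U_\theta} w - \lambda \big(\mathbf{1}^\top w - 1\big),
\]
\[
\nabla_w \mathcal{L} = \Sigma_{U_\theta} w - \lambda \mathbf{1} = 0
\;\Longrightarrow\;
w = \lambda \Sigma_{U_\theta}^{-1} \mathbf{1},
\qquad
\lambda = \frac{1}{\mathbf{1}^\top \Sigma_{U_\theta}^{-1} \mathbf{1}},
\]

where the value of $\lambda$ follows by substituting $w = \lambda \Sigma_{U_\theta}^{-1} \mathbf{1}$ into the sum-to-one constraint. The constraint and the $\tfrac{1}{2}$ scaling are my assumptions; the excerpt shows only the gradient step.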
“…which is achieved by $\hat{W} = \mathbb{E}[W \,|\, X^n]$. Thus the non-asymptotic lower bound on the Bayes risk in (16) captures the correct dependence on $n$, and is off from the true Bayes risk by a constant factor. If we apply the unconditional lower bound (6) to Example 1, we can only get an asymptotic lower bound…”
Section: A. Lower Bounds Based on Mutual Information and Small Ball Probability
mentioning confidence: 96%
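For context, the "achieved by" clause is the standard fact (stated here in my notation) that under squared-error loss the Bayes-optimal estimator is the conditional mean, so the Bayes risk equals the expected conditional variance:

\[
\inf_{\hat{W}} \mathbb{E}\big[(W - \hat{W}(X^n))^2\big]
\;=\; \mathbb{E}\big[(W - \mathbb{E}[W \,|\, X^n])^2\big]
\;=\; \mathbb{E}\big[\operatorname{Var}(W \,|\, X^n)\big].
\]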
“…For the problem of estimating a real-valued parameter $W$ with respect to the quadratic distortion $\ell(w, \hat{w}) = |w - \hat{w}|^2$, it can be shown that (see, e.g., [16, Lemma 5]…”
Section: Generalizations of Fano's Inequality
mentioning confidence: 99%
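The excerpt is cut off before the displayed bound, and I cannot recover [16, Lemma 5] itself. A standard inequality of this type (my addition, not necessarily the one cited) lower-bounds the quadratic Bayes risk via conditional differential entropy:

\[
\inf_{\hat{W}} \mathbb{E}\big[|W - \hat{W}(X)|^2\big]
\;\ge\; \frac{1}{2\pi e}\, e^{2 h(W \mid X)},
\]

which follows from the maximum-entropy property of the Gaussian (fixing the conditional variance) together with Jensen's inequality applied to $e^{2 h(W \mid X = x)}$.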