Two familiar notions of correlation are rediscovered as the extreme operating points for distributed synthesis of a discrete memoryless channel, in which a stochastic channel output is generated based on a compressed description of the channel input. Wyner's common information is the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal trade-off between the amount of common randomness used and the required rate of description. We also include a number of related derivations, including the effect of limited local randomness, rate requirements for secrecy, applications to game theory, and new insights into common information duality. Our proof makes use of a soft covering lemma, known in the literature for its role in quantifying the resolvability of a channel. The direct proof (achievability) constructs a feasible joint distribution over all parts of the system using soft covering, from which the behavior of the encoder and decoder is inferred, with no explicit reference to joint typicality or binning. Of auxiliary interest, this work also generalizes and strengthens this soft covering tool.
Comment: To appear in IEEE Trans. on Information Theory (submitted Aug. 2012, accepted July 2013), 26 pages, uses IEEEtran.cls.
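As a reference point for the trade-off described above, the following is a minimal LaTeX sketch of the rate region in the form commonly associated with this problem; the symbols R (description rate), R_0 (common-randomness rate), and the auxiliary variable U are our own labeling, not taken verbatim from the abstract.

% Sketch of the synthesis trade-off region; R, R_0, and U are assumed notation.
\[
  \mathcal{S} \;=\; \Bigl\{ (R, R_0) \;:\; \exists\, p_{U \mid X Y} \text{ with } X - U - Y,\;
  R \ge I(X;U),\; R + R_0 \ge I(X,Y;U) \Bigr\}.
\]

At R_0 = 0 the binding constraint becomes R >= min_{X - U - Y} I(X,Y;U), which is Wyner's common information; as R_0 grows without bound it relaxes to R >= I(X;Y), Shannon's mutual information, matching the two extreme operating points named in the abstract.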
We develop elements of a theory of cooperation and coordination in networks. Rather than considering a communication network as a means of distributing information, or of reconstructing random processes at remote nodes, we ask what dependence can be established among the nodes given the communication constraints. Specifically, in a network with communication rates {R_{i,j}} between the nodes, we ask what is the set of all achievable joint distributions p(x_1, ..., x_m) of actions at the nodes of the network. Several networks are solved, including arbitrarily large cascade networks. Distributed cooperation can be the solution to many problems, such as distributed games, distributed control, and establishing mutual-information bounds on the influence of one part of a physical system on another.
Comment: Accepted to IEEE Trans. on Info. Theory, 25 pages, 23 EPS figures, uses IEEEtran.cls.
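For orientation, here is a hedged LaTeX sketch of the simplest instance of this question: two nodes, where node 1 observes an i.i.d. source X^n ~ p(x) and sends a message at rate R to node 2, which produces actions Y^n, with the goal that the empirical joint distribution of (X^n, Y^n) approximate a target p(x,y). The setup and notation here are our own illustrative choices, not a statement of the paper's general result.

% Two-node empirical coordination, illustrative special case (notation ours):
% node 1 sees X^n i.i.d. ~ p(x) and sends nR bits; node 2 outputs Y^n.
\[
  p(x,y) \text{ is achievable in empirical distribution}
  \quad \Longleftrightarrow \quad
  R \;\ge\; I(X;Y).
\]

In words, the communication rate must cover the mutual information of the desired joint behavior; larger networks, including the cascade networks mentioned above, require correspondingly richer rate regions.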
We establish that the feedback capacity of the trapdoor channel is the logarithm of the golden ratio and provide a simple communication scheme that achieves capacity. As part of the analysis, we formulate a class of dynamic programs that characterize capacities of unifilar finite-state channels. The trapdoor channel is an instance that admits a simple analytic solution.
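Concretely, writing \varphi for the golden ratio, the stated capacity evaluates as follows; this is only a numerical unpacking of the claim in the abstract, in bits per channel use.

% Numerical evaluation of the stated feedback capacity (base-2 logarithm assumed).
\[
  C_{\mathrm{fb}} \;=\; \log_2 \varphi
  \;=\; \log_2\!\frac{1+\sqrt{5}}{2}
  \;\approx\; \log_2(1.618)
  \;\approx\; 0.694 \ \text{bits per channel use}.
\]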
Differential privacy is a precise mathematical constraint meant to ensure privacy of individual pieces of information in a database even while queries are being answered about the aggregate. Intuitively, one must come to terms with what differential privacy does and does not guarantee. For example, the definition prevents a strong adversary who knows all but one entry in the database from inferring much more about the remaining one. This strong-adversary assumption can be overlooked, resulting in misinterpretation of the privacy guarantee of differential privacy. Herein we give an equivalent definition of privacy using mutual information that makes plain some of the subtleties of differential privacy. Mutual-information differential privacy is in fact sandwiched between $\epsilon$-differential privacy and $(\epsilon,\delta)$-differential privacy in terms of its strength. In contrast to previous works using unconditional mutual information, differential privacy is fundamentally related to conditional mutual information, accompanied by a maximization over the database distribution. The conceptual advantage of using mutual information, aside from yielding a simpler and more intuitive definition of differential privacy, is that its properties are well understood. Several properties of differential privacy are easily verified for the mutual-information alternative, such as composition theorems.
Comment: ACM CCS 2016, 12 pages.
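To make the comparison concrete, the following hedged sketch juxtaposes the standard $\epsilon$-differential-privacy constraint with a conditional-mutual-information constraint of the kind described above. The second display is our rendering of the idea (entry X_i, remaining entries X_{-i}, query answer Y, supremum over database distributions); the exact form and units used in the paper may differ.

% Standard epsilon-DP: databases x, x' differing in one entry, any measurable set S.
\[
  \Pr(Y \in S \mid X = x) \;\le\; e^{\epsilon}\, \Pr(Y \in S \mid X = x').
\]
% Conditional-mutual-information form (our sketch of the definition; notation assumed).
\[
  \sup_{P_{X}} \; \max_{i} \; I\bigl(X_i ; Y \mid X_{-i}\bigr) \;\le\; \epsilon.
\]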
Two familiar notions of correlation are rediscovered as extreme operating points for simulating a discrete memoryless channel, in which a channel output is generated based only on a description of the channel input. Wyner's "common information" coincides with the minimum description rate needed. However, when common randomness independent of the input is available, the necessary description rate reduces to Shannon's mutual information. This work characterizes the optimal tradeoff between the amount of common randomness used and the required rate of description.