This paper motivates the multiterminal secret-key capacity as a useful measure of multivariate mutual information, develops information-theoretic properties of this measure, and makes comparisons with other existing multivariate correlation measures.

Abstract: The capacity for multiterminal secret-key agreement inspires a natural generalization of Shannon's mutual information from two random variables to multiple random variables. Under a general source model without helpers, the capacity is shown to be equal to the normalized divergence from the joint distribution of the random sources to the product of marginal distributions, minimized over partitions of the random sources. The mathematical underpinnings are the works on co-intersecting submodular functions and the principal lattices of partitions of the Dilworth truncation. We clarify the connection to these works and enrich them with information-theoretic interpretations and properties that are useful in solving other related problems in information theory as well as machine learning.
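To make the characterization above concrete, here is a hedged sketch in our own notation (not taken from the abstract): for a discrete random vector $Z_V = (Z_i : i \in V)$ with joint distribution $P_{Z_V}$, the secret-key capacity described above can be written as

$$ C_{\mathrm{S}}(Z_V) \;=\; \min_{\mathcal{P}} \frac{1}{|\mathcal{P}|-1}\, D\!\left( P_{Z_V} \,\Big\|\, \prod_{C \in \mathcal{P}} P_{Z_C} \right), $$

where the minimization is over partitions $\mathcal{P}$ of $V$ into at least two nonempty parts and $D(\cdot\|\cdot)$ is the Kullback-Leibler divergence; the factor $1/(|\mathcal{P}|-1)$ is the normalization mentioned in the abstract. For $|V| = 2$ the only admissible partition is $\{\{1\},\{2\}\}$, and the expression reduces to Shannon's mutual information $I(Z_1; Z_2)$.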
This paper stands at the intersection of two distinct lines of research. One line is "holographic algorithms," a powerful approach introduced by Valiant for solving various counting problems in computer science; the other is "normal factor graphs," an elegant framework proposed by Forney for representing codes defined on graphs. We introduce the notion of holographic transformations for normal factor graphs, and establish a very general theorem, called the generalized Holant theorem, which relates a normal factor graph to its holographic transformation. We show that the generalized Holant theorem on the one hand underlies the principle of holographic algorithms, and on the other hand reduces to a general duality theorem for normal factor graphs, a special case of which was first proved by Forney. In the course of our development, we formalize a new semantics for normal factor graphs, which highlights various linear algebraic properties that potentially enable the use of normal factor graphs as a linear algebraic tool.
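To convey the flavor of the generalized Holant theorem, here is a hedged, single-edge sketch in our own notation (the paper's statement is far more general): summing out an internal edge variable pairs the two incident local functions bilinearly, and such a pairing is unchanged when a matched pair of transformations is inserted on the edge. Writing $f$ and $g$ for the two local functions viewed as vectors indexed by the edge variable $x_e$, and $T$ for any invertible matrix,

$$ \sum_{x_e} f(x_e)\, g(x_e) \;=\; f^{\top} g \;=\; \bigl(T^{\top} f\bigr)^{\top} \bigl(T^{-1} g\bigr). $$

Applying such a matched pair to every internal edge transforms the normal factor graph while leaving its exterior function unchanged; particular choices of $T$, such as a Fourier kernel, lead to duality statements of the kind the abstract attributes to Forney.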
Abstract: We formulate an info-clustering paradigm based on a multivariate information measure, called multivariate mutual information, that naturally extends Shannon's mutual information between two random variables to the multivariate case of more than two random variables. With proper model reductions, we show that the paradigm can be applied to study the human genome and connectome in a more meaningful way than the conventional algorithmic approach. Not only can info-clustering provide justifications and refinements for some existing techniques, but it also inspires new computationally feasible solutions.
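As a hedged sketch of how such a paradigm can be set up (our notation; the paper's precise definitions may differ), one can define the clusters at a real threshold $\gamma$ as the maximal subsets of variables whose multivariate mutual information exceeds $\gamma$:

$$ \mathcal{C}_{\gamma}(Z_V) \;:=\; \operatorname{maximal}\, \bigl\{ B \subseteq V : |B| > 1,\ I(Z_B) > \gamma \bigr\}, $$

where $I(Z_B)$ denotes the multivariate mutual information of the subvector $Z_B$ (of the partition-based form sketched after the first abstract). As $\gamma$ increases the clusters shrink and nest, so sweeping the threshold produces a hierarchical clustering.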