Entropy and differential entropy are important quantities in information theory. A tractable extension to singular random variables, which are neither discrete nor continuous, has not been available so far. Here, we present such an extension for the practically relevant class of integer-dimensional singular random variables. The proposed entropy definition contains the entropy of discrete random variables and the differential entropy of continuous random variables as special cases. We show that it transforms in a natural manner under Lipschitz functions, and that it is invariant under unitary transformations. We define joint entropy and conditional entropy for integer-dimensional singular random variables, and we show that the proposed entropy yields useful expressions for the mutual information. As first applications of our entropy definition, we present a result on the minimal expected codeword length of quantized integer-dimensional singular sources and a Shannon lower bound for integer-dimensional singular sources.

Index Terms: Information entropy, rate distortion theory, Shannon lower bound, singular random variables, source coding.
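For reference, the two classical special cases that the proposed definition recovers are the entropy of a discrete random variable $X$ with probability mass function $p$ and the differential entropy of a continuous random variable $X$ with density $f_X$:

\[
H(X) = -\sum_{x} p(x)\log p(x), \qquad h(X) = -\int f_X(x)\log f_X(x)\,\mathrm{d}x.
\]

A singular random variable, such as one distributed uniformly on the unit circle in $\mathbb{R}^2$, has neither a probability mass function nor a density on $\mathbb{R}^2$, so neither formula applies directly; the integer-dimensional entropy above is meant to bridge exactly this gap.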
Let $(X^n, Y^n)$ denote $n$ independent, identically distributed copies of two arbitrarily correlated Rademacher random variables $(X, Y)$. We prove that the inequality $I(f(X^n); g(Y^n)) \le I(X; Y)$ holds for any two Boolean functions $f, g \colon \{-1, 1\}^n \to \{-1, 1\}$, where $I(\cdot\,;\cdot)$ denotes mutual information. We further show that equality is, in general, achieved only by the dictator functions $f(x) = \pm g(x) = \pm x_i$, $i \in \{1, 2, \ldots, n\}$.
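A brute-force numerical check of this inequality is straightforward for small $n$: enumerate every pair of Boolean functions, compute the mutual information they induce, and compare against $I(X; Y)$. The sketch below is illustrative only and not from the paper; the correlation $\rho = 0.5$ and the block length $n = 2$ are arbitrary choices.

```python
import itertools
import math

def mutual_info(joint):
    """Mutual information (in bits) of a joint pmf given as {(a, b): prob}."""
    px, py = {}, {}
    for (a, b), p in joint.items():
        px[a] = px.get(a, 0.0) + p
        py[b] = py.get(b, 0.0) + p
    return sum(p * math.log2(p / (px[a] * py[b]))
               for (a, b), p in joint.items() if p > 0)

rho = 0.5                 # correlation of the Rademacher pair (arbitrary choice)
p_eq = (1 + rho) / 2      # P(X = Y) for correlated Rademacher variables
pair = {(1, 1): p_eq / 2, (-1, -1): p_eq / 2,
        (1, -1): (1 - p_eq) / 2, (-1, 1): (1 - p_eq) / 2}

n = 2                     # small block length so exhaustive search is feasible
inputs = list(itertools.product([-1, 1], repeat=n))

# Joint pmf of (X^n, Y^n): n i.i.d. copies of the correlated pair.
joint_n = {(x, y): math.prod(pair[(xi, yi)] for xi, yi in zip(x, y))
           for x in inputs for y in inputs}

# Exhaustively search all Boolean functions f, g : {-1, 1}^n -> {-1, 1}.
best = 0.0
for f_vals in itertools.product([-1, 1], repeat=len(inputs)):
    f = dict(zip(inputs, f_vals))
    for g_vals in itertools.product([-1, 1], repeat=len(inputs)):
        g = dict(zip(inputs, g_vals))
        induced = {}
        for (x, y), p in joint_n.items():
            key = (f[x], g[y])
            induced[key] = induced.get(key, 0.0) + p
        best = max(best, mutual_info(induced))

print(f"max over f, g of I(f(X^n); g(Y^n)) = {best:.6f} bits")
print(f"I(X; Y)                            = {mutual_info(pair):.6f} bits")
# Per the theorem, the maximum equals I(X; Y) and is attained by the
# dictator functions f(x) = g(x) = x_i.
```

Consistent with the equality condition stated above, the two printed values coincide, and the maximizing pairs are exactly the dictator functions.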
We study a novel multi-terminal source coding setup motivated by the biclustering problem. Two separate encoders observe two i.i.d. sequences $X^n$ and $Y^n$, respectively. The goal is to find rate-limited encodings $f(x^n)$ and $g(y^n)$ that maximize the mutual information $I(f(X^n); g(Y^n))/n$. We discuss connections of this problem with hypothesis testing against independence, pattern recognition, and the information bottleneck method. Improving previous cardinality bounds for the inner and outer bounds allows us to thoroughly study the special case of a binary symmetric source and to quantify the gap between the inner and the outer bound in this special case. Furthermore, we investigate a multiple description (MD) extension of the CEO problem with a mutual information constraint. Surprisingly, this MD-CEO problem permits a tight single-letter characterization of the achievable region.
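One natural formalization of the rate-limited objective (the rate budgets $R_1$ and $R_2$ are assumed notation for illustration, not taken from the abstract) is

\[
\max_{f,\,g} \; \frac{1}{n}\, I\bigl(f(X^n); g(Y^n)\bigr)
\quad \text{s.t.} \quad
f \colon \mathcal{X}^n \to \{1, \ldots, 2^{nR_1}\},\;\;
g \colon \mathcal{Y}^n \to \{1, \ldots, 2^{nR_2}\},
\]

i.e., each encoder's output may carry at most $nR_1$ (respectively $nR_2$) bits.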