Two familiar notions of correlation are rediscovered as the extreme operating
points for distributed synthesis of a discrete memoryless channel, in which a
stochastic channel output is generated based on a compressed description of the
channel input. Wyner's common information is the minimum description rate
needed. However, when common randomness independent of the input is available,
the necessary description rate reduces to Shannon's mutual information. This
work characterizes the optimal trade-off between the amount of common
randomness used and the required rate of description. We also present a number
of related derivations, including the effect of limited local randomness, rate
requirements for secrecy, applications to game theory, and new insights into
common information duality.
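For concreteness, the following LaTeX sketch records the single-letter form this trade-off region takes in the channel synthesis literature; the notation here ($R$ for the description rate, $R_0$ for the rate of common randomness, $U$ for an auxiliary variable) is our labeling, and the paper's theorems give the precise statement:

\[
\mathcal{S} \;=\; \Bigl\{ (R, R_0) \;:\; \exists\, p_{U \mid X, Y} \text{ with } X - U - Y, \quad R \ge I(X;U), \quad R + R_0 \ge I(X,Y;U) \Bigr\}.
\]

Setting $R_0 = 0$ forces $R \ge \min_{X - U - Y} I(X,Y;U)$, which is Wyner's common information, while letting $R_0$ grow without bound leaves only $R \ge \min_{X - U - Y} I(X;U) = I(X;Y)$, recovering Shannon's mutual information as the other extreme.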
Our proof makes use of a soft covering lemma, known in the literature for its
role in quantifying the resolvability of a channel. The direct proof
(achievability) constructs a feasible joint distribution over all parts of the
system using a soft covering argument, from which the behavior of the encoder and
decoder is inferred, with no explicit reference to joint typicality or binning.
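To make the covering step concrete: in its basic form, the soft covering lemma states that if a codebook of roughly $2^{nR}$ codewords is drawn i.i.d. from $P_U^n$ and $R > I(U;Y)$, then the output distribution induced by passing a uniformly chosen codeword through the memoryless channel $P_{Y|U}$ approaches the i.i.d. distribution $P_Y^n$ in total variation. The Python sketch below illustrates this effect numerically for a binary symmetric channel; it is a toy illustration of the lemma only, not the paper's construction, and the channel and rates are example choices.

import numpy as np

rng = np.random.default_rng(0)

p = 0.1                                 # BSC crossover probability (example choice)
P_U = np.array([0.5, 0.5])              # uniform input distribution on {0, 1}
# For a BSC with uniform input, P_Y is uniform and I(U;Y) = 1 - h2(p).
I_UY = 1.0 + p * np.log2(p) + (1 - p) * np.log2(1 - p)

def all_sequences(n):
    """All binary sequences of length n, as rows of a (2**n, n) array."""
    return (np.arange(2 ** n)[:, None] >> np.arange(n)[None, :]) & 1

def induced_tv(n, rate, trials=20):
    """Average total variation between the codebook-induced output
    distribution and the i.i.d. target P_Y^n, over random codebooks."""
    num_words = max(1, int(round(2 ** (n * rate))))
    ys = all_sequences(n)
    target = np.full(2 ** n, 0.5 ** n)  # P_Y^n is uniform here
    tvs = []
    for _ in range(trials):
        codebook = rng.choice(2, size=(num_words, n), p=P_U)
        # Hamming distance between every codeword and every output
        # sequence, via inner products: d = u.(1-y) + (1-u).y
        d = codebook @ (1 - ys.T) + (1 - codebook) @ ys.T
        likelihood = (p ** d) * ((1 - p) ** (n - d))  # P_{Y|U}^n(y^n | u^n)
        induced = likelihood.mean(axis=0)             # uniform codeword choice
        tvs.append(0.5 * np.abs(induced - target).sum())
    return float(np.mean(tvs))

print(f"I(U;Y) = {I_UY:.3f} bits")
for n in (4, 6, 8, 10):
    below = induced_tv(n, rate=0.25)  # below I(U;Y): covering fails
    above = induced_tv(n, rate=1.00)  # above I(U;Y): TV shrinks with n
    print(f"n={n:2d}   TV at R=0.25: {below:.3f}   TV at R=1.00: {above:.3f}")

At rate 1.0 bit (above $I(U;Y) \approx 0.531$ bits here) the measured total variation should shrink as the blocklength grows, while at rate 0.25 bits it should not; the blocklengths are small, so the decay is gentle rather than dramatic.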
Of auxiliary interest, this work also generalizes and strengthens this soft
covering tool.

Comment: To appear in IEEE Trans. on Information Theory (submitted Aug. 2012, accepted July 2013), 26 pages, using IEEEtran.cl