We consider the problem of distributed source simulation with no communication, in which Alice and Bob observe sequences $U^n$ and $V^n$ respectively, drawn from a joint distribution $p_{UV}^{\otimes n}$, and wish to locally generate sequences $X^n$ and $Y^n$ respectively with a joint distribution that is close (in KL divergence) to $p_{XY}^{\otimes n}$. We provide a single-letter condition under which such a simulation is asymptotically possible with vanishing KL divergence. Our condition is nontrivial only when the Gács-Körner (GK) common information between $U$ and $V$ is nonzero, and we conjecture that only scalar Markov chains $X - U - V - Y$ can be simulated otherwise. Motivated by this conjecture, we further examine the case where both $p_{UV}$ and $p_{XY}$ are doubly symmetric binary sources with parameters $p, q \le 1/2$, respectively. While it is trivial that in this case $p \le q$ is both necessary and sufficient, we show that when $p$ is close to $q$, any successful simulation is close to being scalar in the total variation sense.
Figure 2: Digital solution

This digital approach is viable only when $C_{\mathsf{GK}}(U;V) > 0$. There is an even simpler analog approach that does not use common information: Alice and Bob pass their corresponding sequences through the memoryless channels $p_{X^n|U^n} = p_{X|U}^{\otimes n}$ and $p_{Y^n|V^n} = p_{Y|V}^{\otimes n}$, respectively, symbol by symbol.

Proposition 2 (analog solution). If $X - U - V - Y$ form a Markov chain, then $p_{XY}$ is simulable from $p_{UV}$.
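The analog solution above can be sketched numerically. The following is a minimal illustration, not part of the paper: the joint source $p_{UV}$ and the channels $p_{X|U}$, $p_{Y|V}$ are hypothetical binary examples chosen only to show that symbol-by-symbol local randomization reproduces the target $p_{XY} = \sum_{u,v} p_{UV}(u,v)\, p_{X|U}(x|u)\, p_{Y|V}(y|v)$ induced by the Markov chain $X - U - V - Y$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary joint source p_UV and channels (not from the paper).
p_UV = np.array([[0.4, 0.1],
                 [0.1, 0.4]])           # p_UV[u, v]
p_X_given_U = np.array([[0.9, 0.1],
                        [0.2, 0.8]])    # rows indexed by u, columns by x
p_Y_given_V = np.array([[0.7, 0.3],
                        [0.25, 0.75]])  # rows indexed by v, columns by y

n = 200_000
# Draw (U^n, V^n) i.i.d. from p_UV.
flat = rng.choice(4, size=n, p=p_UV.ravel())
U, V = np.divmod(flat, 2)

# Analog solution: with no communication, Alice passes U^n through the
# memoryless channel p_{X|U} and Bob passes V^n through p_{Y|V},
# symbol by symbol (here via one Bernoulli draw per symbol).
X = (rng.random(n) < p_X_given_U[U, 1]).astype(int)
Y = (rng.random(n) < p_Y_given_V[V, 1]).astype(int)

# Target joint induced by the Markov chain X - U - V - Y:
# p_XY(x, y) = sum_{u,v} p_UV(u, v) p(x|u) p(y|v).
p_XY = np.einsum('uv,ux,vy->xy', p_UV, p_X_given_U, p_Y_given_V)

# Empirical joint of the simulated pair (X^n, Y^n).
emp = np.zeros((2, 2))
np.add.at(emp, (X, Y), 1.0)
emp /= n

print(np.round(p_XY, 3))
print(np.round(emp, 3))
```

With $n$ this large, the empirical joint of $(X^n, Y^n)$ matches the target $p_{XY}$ to within sampling error, consistent with Proposition 2.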