A basic problem in information theory is the following: let P = (X, Y) be an arbitrary distribution whose marginals X and Y are (potentially) correlated. Let Alice and Bob be two players, where Alice receives samples {x_i}_{i≥1} and Bob receives samples {y_i}_{i≥1} with (x_i, y_i) ∼ P for all i. Which joint distributions Q can Alice and Bob simulate without any interaction?

Classical works in information theory by Gács-Körner and Wyner answer this question when at least one of P or Q is the distribution Eq, defined as uniform over the points (0, 0) and (1, 1). Beyond this special case, however, the answer is understood in very few settings. Recently, Ghazi, Kamath and Sudan showed that this problem is decidable for Q supported on {0, 1} × {0, 1}. We extend their result to Q supported on any finite alphabet. Moreover, we show that if Q can be simulated, our algorithm also produces a (non-interactive) simulation protocol.

We rely on recent results in Gaussian geometry (by the authors) as well as a new smoothing argument inspired by the method of boosting from learning theory and potential function arguments from complexity theory and additive combinatorics.
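The non-interactive simulation setup above can be illustrated with a minimal sketch (this is not the paper's algorithm; the source P here, a doubly symmetric binary source with a correlation parameter rho, and the local functions are illustrative assumptions): each player applies a local function to their own samples, and the question is which joint output distributions Q are reachable.

```python
import random
from collections import Counter

def sample_P(rho, rng):
    """One sample (x, y) from a doubly symmetric binary source:
    x is uniform in {0, 1}; y agrees with x with probability (1 + rho) / 2."""
    x = rng.randrange(2)
    y = x if rng.random() < (1 + rho) / 2 else 1 - x
    return x, y

def simulate(alice_fn, bob_fn, rho, n, seed=0):
    """Empirical joint distribution of (alice_fn(x), bob_fn(y)) over n
    i.i.d. samples from P; no communication between the two players."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(n):
        x, y = sample_P(rho, rng)
        counts[(alice_fn(x), bob_fn(y))] += 1
    return {k: v / n for k, v in sorted(counts.items())}

# With identity maps the players reproduce P itself; at rho = 1 (perfect
# correlation) the output is always (0, 0) or (1, 1), i.e. they simulate Eq.
print(simulate(lambda x: x, lambda y: y, rho=1.0, n=10_000))
```

The decidability question the abstract addresses is whether, given P and a target Q, such local strategies (allowing private randomness and functions of many samples) can produce Q, and at rho < 1 this is far from obvious even for small alphabets.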