This paper introduces a new converse machinery for a challenging class of distributed source-type problems (e.g., distributed source coding, common randomness generation, or hypothesis testing with communication constraints), through the example of the Wyner-Ahlswede-Körner network. Using the functional-entropic duality and the reverse hypercontractivity of the transposition semigroup, we lower bound the error probability for each joint type. Then, by averaging the error probability over types, we lower bound the $c$-dispersion (which characterizes the second-order behavior of the weighted sum of the rates of the two compressors when a nonvanishing error probability is small) as the variance of the gradient of $\inf_{P_{U|X}}\{cH(Y|U)+I(U;X)\}$ with respect to $Q_{XY}$, the per-letter side information and source distribution. In comparison, using standard achievability arguments based on the method of types, we upper bound the $c$-dispersion as the variance of $c\,\imath_{Y|U}(Y|U)+\imath_{U;X}(U;X)$, which improves the existing upper bounds but still leaves a gap to the aforementioned lower bound.
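For readability, the two bounds may be summarized as the following sandwich; this is a sketch of the statement only, where $V_c$ is an assumed shorthand for the $c$-dispersion (not notation taken from the original text), and the distributions over which the variances are evaluated follow the definitions given in the body of the paper:
\[
\operatorname{Var}\!\Big[\nabla_{Q_{XY}} \inf_{P_{U|X}}\{cH(Y|U)+I(U;X)\}\Big]
\;\le\; V_c \;\le\;
\operatorname{Var}\!\big[c\,\imath_{Y|U}(Y|U)+\imath_{U;X}(U;X)\big].
\]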