Information-theoretic quantities reveal dependencies among variables in the structure of joint, marginal, and conditional entropies, but leave some fundamentally different systems indistinguishable. Furthermore, there is no consensus on how to construct and interpret a higher-order generalisation of mutual information (MI). In this manuscript, we show that a recently proposed model-free definition of higher-order interactions amongst binary variables (MFIs), like mutual information, is a Möbius inversion on a Boolean algebra, but of surprisal instead of entropy. This gives an information-theoretic interpretation to the MFIs, and by extension to Ising interactions. We study the dual objects to MI and MFIs on the order-reversed lattice, and find that dual MI is related to the previously studied differential mutual information, while dual interactions (outeractions) are interactions with respect to a different background state. Unlike (dual) mutual information, in- and outeractions uniquely identify all six 2-input logic gates, the dy- and triadic distributions, and different causal dynamics that are identical in terms of their Shannon-information content.
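The parallel between MI and the MFIs can be sketched as a pair of Möbius inversions over the Boolean lattice of subsets. The following display is illustrative notation, not the manuscript's own: \(H\) denotes Shannon entropy, \(s(x_T) = -\log p(x_T)\) the surprisal of the state \(x_T\) in which the variables in \(T\) are switched on against an assumed all-zeros background state.

```latex
% Interaction information as a Möbius inversion of entropy
% over the subsets T of a variable set S:
I(S) = -\sum_{T \subseteq S} (-1)^{|S \setminus T|} \, H(T),
% which for |S| = 2 recovers mutual information:
I(X;Y) = H(X) + H(Y) - H(X,Y).

% The MFIs apply the same inversion to surprisal instead of entropy:
I_S = -\sum_{T \subseteq S} (-1)^{|S \setminus T|} \, s(x_T),
% e.g. for S = \{i, j\} with an all-zeros background:
I_{ij} = \log \frac{p(1,1)\,p(0,0)}{p(1,0)\,p(0,1)}.
```

On this reading, the only change between the two constructions is the function being inverted; the lattice and the Möbius coefficients are identical.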