Reconstructing the structural connectivity between interacting units from observed activity is a challenge across many disciplines. A fundamental first step is to establish whether, or to what extent, the interactions between the units can be considered pairwise and can thus be modeled as an interaction network with simple links corresponding to pairwise interactions. In principle this can be determined by comparing the maximum entropy consistent with the bivariate probability distributions to the true joint entropy. In many practical cases this is not an option, either because the required bivariate distributions cannot be reliably estimated or because the optimization is too computationally expensive. Here we present an approach that uses mutual information as a proxy for the bivariate probability distributions; mutual information is less computationally expensive to work with and easier to estimate reliably. We achieve this by introducing a novel entropy-maximization scheme based on conditioning on entropies and mutual informations. This renders our approach typically superior to other methods based on linear approximations. The advantages of the proposed method are documented using oscillator networks and a resting-state human brain network as generic relevant examples.

Pairwise measures of dependence, such as cross-correlations (as measured by the Pearson correlation coefficient or the covariance matrix) and mutual information, are widely used to characterize the interactions within complex systems. They are a key ingredient of techniques such as principal component analysis, empirical orthogonal functions, and functional networks (networks inferred from dynamical time series) [1-3]. These techniques are widespread because they provide greatly simplified descriptions of complex systems and allow for the analysis of what might otherwise be intractable problems [4].
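As a minimal illustration of the two pairwise measures named above, the following sketch estimates both the Pearson correlation coefficient and a histogram-based (plug-in) mutual information between two coupled time series. The function names, the synthetic data, and the binning choice are illustrative assumptions, not part of the paper's method.

```python
import numpy as np

def pearson(x, y):
    # Pearson correlation coefficient between two 1-D series
    return np.corrcoef(x, y)[0, 1]

def mutual_information(x, y, bins=16):
    # Histogram-based plug-in estimator of I(X;Y) in bits
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1)          # marginal of X
    py = pxy.sum(axis=0)          # marginal of Y
    nz = pxy > 0                  # avoid log(0)
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px[:, None] * py[None, :])[nz])))

# Toy pair of linearly coupled series (true correlation 0.8)
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = 0.8 * x + 0.6 * rng.normal(size=10_000)
print(pearson(x, y), mutual_information(x, y))
```

Note that the plug-in mutual-information estimator is biased for finite samples; the paper's point that bivariate quantities can be hard to estimate reliably applies even more strongly to the full bivariate distributions themselves.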
In particular, functional networks have been widely applied in fields such as neuroscience [4,5], genetics [6], and cell physiology [7], as well as in climate research [1,8].

In this paper we study how faithfully these measures alone can represent a given system. With the increasing use of functional networks this topic has received much attention recently, and many technical concerns about the inference of these networks have been brought to light. Previous studies have shown that estimates of functional networks can be negatively affected by properties of the time series [9-11], as well as by properties of the measure of association, e.g., cross-correlations [12-15]. In this work, however, we address a more fundamental question: how well do pairwise measurements represent a system?

In principle this can be evaluated using a maximum entropy approach. The corresponding framework was first laid out in [16] and later applied in [17], where the authors assessed the rationale of looking only at the pairwise correlations between neurons. They examined how well the maximum entropy distribution consistent with all the pairwise correlations described...
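The comparison underlying this maximum entropy approach can be sketched for a small system. The toy example below builds a joint distribution over three binary units with a purely triplet interaction, finds the maximum-entropy distribution consistent with all its bivariate marginals via iterative proportional fitting, and compares the two entropies; a gap signals structure that pairwise quantities cannot capture. The triplet model, function names, and fitting procedure are illustrative assumptions, not the scheme proposed in the paper.

```python
import itertools
import numpy as np

def joint_entropy(p):
    # Shannon entropy in bits of a joint distribution (any array shape)
    nz = p > 0
    return float(-np.sum(p[nz] * np.log2(p[nz])))

def pair_marginal(p, i, j):
    # Bivariate marginal of units i < j from a joint over n binary units
    axes = tuple(k for k in range(p.ndim) if k not in (i, j))
    return p.sum(axis=axes)

def maxent_pairwise(p_true, iters=500):
    # Iterative proportional fitting: the maximum-entropy joint
    # consistent with all bivariate marginals of p_true
    n = p_true.ndim
    p = np.full(p_true.shape, 1.0 / p_true.size)
    for _ in range(iters):
        for i, j in itertools.combinations(range(n), 2):
            target = pair_marginal(p_true, i, j)
            current = pair_marginal(p, i, j)
            ratio = np.where(current > 0,
                             target / np.where(current > 0, current, 1.0), 0.0)
            shape = [1] * n
            shape[i] = shape[j] = 2
            p = p * ratio.reshape(shape)
            p /= p.sum()
    return p

# Toy joint with a purely triplet interaction: p(s) ∝ exp(s1*s2*s3), s_i ∈ {-1,+1}.
# Its bivariate marginals are all uniform, so the pairwise maxent model is uniform.
states = np.array(list(itertools.product([-1, 1], repeat=3)))
w = np.exp(states.prod(axis=1)).reshape(2, 2, 2)
p_true = w / w.sum()

p_me = maxent_pairwise(p_true)
print(joint_entropy(p_true), joint_entropy(p_me))  # entropy gap reveals higher-order structure
```

Already for this three-unit example the brute-force approach enumerates all 2^n states, which is what makes the direct comparison computationally infeasible for larger systems and motivates proxies such as mutual information.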