We take a closer look at the structure of bivariate dependency induced by a pair of predictor random variables $(X_1, X_2)$ that encode a target random variable $Y$ synergistically, redundantly, or uniquely. We evaluate a recently proposed measure of redundancy based on Gács and Körner's common information (Griffith et al., Entropy 2014, 16, 1985-2000) and show that the measure, in spite of its elegance, is degenerate for most non-trivial distributions. We show that Wyner's common information also fails to capture the notion of redundancy, since it violates an intuitive monotonically non-increasing property. We identify a set of conditions under which a conditional version of Gács and Körner's common information is an ideal measure of unique information. Finally, we show how the notions of approximately sufficient statistics and the conditional information bottleneck can be used to quantify unique information.
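To make the degeneracy claim concrete, recall the standard characterization of Gács and Körner's common information, $C_{GK}(X_1; X_2) = \max H(V)$ over random variables $V$ that are a deterministic function of $X_1$ alone and also of $X_2$ alone; equivalently, $C_{GK}$ is the entropy of the connected components of the bipartite graph on the supports of $X_1$ and $X_2$, with an edge wherever $p(x_1, x_2) > 0$. The sketch below is not from the paper; it assumes this characterization, and the function name gacs_korner and the example pmfs are purely illustrative. It shows that a block-structured joint pmf yields positive $C_{GK}$, while any full-support pmf, however strongly correlated, collapses to zero.

```python
import numpy as np
from itertools import product

def gacs_korner(p):
    """Gacs-Korner common information of a joint pmf p[x1, x2] in bits.

    C_GK = H(Q), where Q labels the connected components of the
    bipartite graph on the supports of X1 and X2, with an edge
    (x1, x2) whenever p[x1, x2] > 0.
    """
    n1, n2 = p.shape
    # Union-find over the n1 + n2 bipartite nodes.
    parent = list(range(n1 + n2))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i
    def union(i, j):
        parent[find(i)] = find(j)
    # Merge nodes connected by a positive-probability cell.
    for x1, x2 in product(range(n1), range(n2)):
        if p[x1, x2] > 0:
            union(x1, n1 + x2)
    # Total probability mass of each connected component.
    mass = {}
    for x1, x2 in product(range(n1), range(n2)):
        if p[x1, x2] > 0:
            root = find(x1)
            mass[root] = mass.get(root, 0.0) + p[x1, x2]
    q = np.array(list(mass.values()))
    return float(-np.sum(q * np.log2(q)))

# A block-structured pmf has positive common information ...
p_block = np.array([[0.25, 0.0],
                    [0.0, 0.75]])
print(gacs_korner(p_block))   # ~0.811 bits

# ... but any full-support pmf collapses to zero, e.g. two
# binary variables agreeing 99% of the time (illustrative).
eps = 0.01
p_full = np.array([[(1 - eps) / 2, eps / 2],
                   [eps / 2, (1 - eps) / 2]])
print(gacs_korner(p_full))    # 0.0, despite I(X1; X2) ~ 0.92 bits
```

This is the degeneracy in question: perturbing a block-structured pmf by an arbitrarily small full-support term merges all components and drives $C_{GK}$ discontinuously to zero.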