We provide a new framework for establishing strong lower bounds on the nonnegative rank of matrices by means of common information, a notion previously introduced in [1]. Common information is a natural lower bound for the nonnegative rank of a matrix, and by combining it with Hellinger distance estimates we can compute the (almost) exact common information of the unique disjointness (UDISJ) partial matrix. We also establish robustness of this estimate under various perturbations of the UDISJ partial matrix, where rows and columns are randomly or adversarially removed, or where entries are randomly or adversarially altered. This robustness translates, via a variant of Yannakakis' Factorization Theorem, into lower bounds on the average-case and adversarial approximate extension complexity. We present the first family of polytopes, namely the hard pair related to the CLIQUE problem introduced in [2], with high average-case and adversarial approximate extension complexity. We also provide an information-theoretic variant of the fooling set method that allows us to extend fooling set lower bounds from extension complexity to approximate extension complexity.
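As a minimal sketch of the quantities involved (our notation, following Wyner's definition of common information; the exact normalization used in the paper may differ): for random variables $X$ and $Y$, the common information is
\[
  C(X;Y) \;=\; \min_{Z \,:\, X \perp Y \mid Z} I(X,Y;Z),
\]
the minimum taken over random variables $Z$ rendering $X$ and $Y$ conditionally independent. For a nonnegative matrix $M$, normalizing it to a probability distribution $P(x,y) = M(x,y)/\sum_{x',y'} M(x',y')$ and taking $(X,Y) \sim P$ yields
\[
  \log_2 \operatorname{rank}_+(M) \;\ge\; C(X;Y),
\]
so a lower bound on the common information translates into a lower bound on the nonnegative rank.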