In a recent breakthrough paper [M. Braverman, A. Garg, D. Pankratov, and O. Weinstein, From information to exact communication, in STOC '13: Proceedings of the 2013 ACM Symposium on Theory of Computing, ACM, New York, 2013, pp. 151–160], Braverman et al. developed a local characterization of the zero-error information complexity in the two-party model and used it to compute the exact internal and external information complexity of the 2-bit AND function. In this article, we extend their result to the multiparty number-in-hand model by proving that the generalization of their protocol has optimal internal and external information cost for certain distributions. Our proof has new components and, in particular, fixes some minor gaps in the proof of Braverman et al.
Introduction

Although communication complexity has, since its birth, witnessed steady and rapid progress, it was not until recently that a focus on information-theoretic methods led to a new and deeper understanding of some of the classical problems in the area. This gave birth to a new branch of complexity theory called information complexity. Recall that communication complexity is concerned with minimizing the amount of communication required by players who wish to evaluate a function that depends on their private inputs. Information complexity, on the other hand, is concerned with the amount of information that the communicated bits reveal about the players' inputs, either to one another or to an external observer.

One of the important achievements of information complexity is the recent result of [BGPW13], which determines the exact asymptotics of the randomized communication complexity of set disjointness, one of the oldest and most studied problems in communication complexity:
$$\lim_{\varepsilon \to 0} \lim_{n \to \infty} \frac{R_\varepsilon(\mathrm{DISJ}_n)}{n} \approx 0.4827. \tag{1}$$

* Supported by an NSERC grant.
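For concreteness, here is a minimal sketch of the standard two-party information cost quantities alluded to above; the notation ($\pi$ a protocol with transcript $\Pi$, inputs $(X,Y)$ drawn from a distribution $\mu$) is assumed for illustration rather than taken from this excerpt:
$$\mathrm{IC}^{\mathrm{int}}_{\mu}(\pi) = I(X;\Pi \mid Y) + I(Y;\Pi \mid X), \qquad \mathrm{IC}^{\mathrm{ext}}_{\mu}(\pi) = I(XY;\Pi).$$
The internal cost captures what each player learns about the other's input from the transcript, while the external cost captures what an outside observer learns about both inputs.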