We introduce a new information-theoretic measure that we call Public Information Complexity (PIC), as a tool for the study of multi-party computation protocols and of quantities such as their communication complexity, or the amount of randomness they require in the context of information-theoretic private computations. We are able to use this measure directly in the natural asynchronous message-passing peer-to-peer model, and we show a number of interesting properties and applications of our new notion: the Public Information Complexity is a lower bound on the Communication Complexity and an upper bound on the Information Complexity; the difference between the Public Information Complexity and the Information Complexity provides a lower bound on the amount of randomness used in a protocol; any communication protocol can be compressed to its Public Information Cost; and we give an explicit calculation of the zero-error Public Information Complexity of the k-party, n-bit Parity function, where a player outputs the bit-wise parity of the inputs. The latter result also establishes that the amount of randomness needed by a private protocol that computes this function is Ω(n).

Our main goal is to introduce novel information-theoretic measures for the study of number-in-hand, message-passing multi-party protocols, coupled with a natural model that, among other things, allows private protocols (which is not the case for, e.g., the coordinator model). We define the new measure of Public Information Complexity (PIC) as a tool for the study of multi-party computation protocols and of quantities such as their communication complexity, or the amount of randomness they require in the context of information-theoretic private computations. Intuitively, our new measure captures a combination of the amount of information about the inputs that the players leak to other players, and the amount of randomness that the protocol uses.
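The relations listed above can be summarized in a single chain of inequalities. The notation here is ours and is meant only as a sketch: for a protocol π, we write CC(π) for its communication cost, IC(π) for its (internal) information cost, PIC(π) for its public information cost, and R(π) for the number of random bits it uses.

```latex
\[
  \mathrm{IC}(\pi) \;\le\; \mathrm{PIC}(\pi) \;\le\; \mathrm{CC}(\pi),
  \qquad
  R(\pi) \;\ge\; \mathrm{PIC}(\pi) - \mathrm{IC}(\pi).
\]
```

In particular, since the IC of a private protocol for a function can be 0 while its PIC need not be, the second inequality is what turns a lower bound on PIC into a lower bound on randomness, as in the Ω(n) bound for Parity.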
By proving lower bounds on PIC for a given multi-party function f, we are able to give lower bounds on the multi-party communication complexity of f and on the amount of randomness needed to privately compute f. The crucial point is that, in our multi-party model, the PIC of functions is not always 0, unlike their IC.

Our new measure works in a model that is a slight restriction of the most general asynchronous model: for a given player at a given time, the set of players from which that player waits for a message can be determined by that player's own local view. This restriction gives us the property that, for any protocol, the information leaked during the execution of the protocol is at most the communication cost of the protocol. Note that in the multi-party case, the information cost of a protocol may be higher than its communication cost, because the identity of the player from which one receives a message might itself carry information. We are able to define our measure and use it directly in a natural asynchronous peer-to-peer model (and not, e.g., in the coordinator model used in most works studying...