Information-theoretically secure multi-party computation (MPC) is a central primitive of modern cryptography. However, relatively little is known about the communication complexity of this primitive.

In this work, we develop powerful information-theoretic tools for proving lower bounds on the communication complexity of MPC. We restrict ourselves to a concrete setting involving three parties, in order to bring out the power of these tools without introducing too many complications. Our techniques include the use of a data processing inequality for residual information (i.e., the gap between mutual information and Gács-Körner common information), a new information inequality for 3-party protocols, and the idea of distribution switching, by which lower bounds computed under certain worst-case scenarios can be shown to apply in general.

Using these techniques, we obtain tight bounds on the communication complexity of MPC protocols for various interesting functions. In particular, we exhibit concrete functions that admit "communication-ideal" protocols, which achieve the minimum communication simultaneously on all links in the network. We also obtain the first explicit example of a function that incurs a higher communication cost than its input length in the secure computation model of Feige, Kilian and Naor [FKN94], who had shown that such functions exist. Further, we show that our communication bounds imply tight lower bounds on the amount of randomness required by MPC protocols for many interesting functions.

Finally, we identify a multi-secret sharing primitive that is interesting in its own right, and that has the additional property that lower bounds on its share sizes translate into lower bounds on the communication complexity of MPC protocols. While the resulting bounds are often tight, we use our results to give a concrete example where there is a gap between the share sizes and the communication complexity.
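For concreteness, the following is a sketch of the standard definition of residual information alluded to above; the notation $\mathsf{RI}$, $C_{\mathrm{GK}}$, and $Q$ is introduced here for illustration and is not fixed by this abstract. For jointly distributed random variables $X$ and $Y$, the residual information is the mutual information left over after extracting the Gács-Körner common part:
\[
\mathsf{RI}(X;Y) \;=\; I(X;Y) \;-\; C_{\mathrm{GK}}(X;Y) \;=\; I(X;Y \mid Q),
\]
where $Q$ is the maximal common part of $(X,Y)$, i.e., a random variable computable as a deterministic function of $X$ alone and also of $Y$ alone, and $C_{\mathrm{GK}}(X;Y) = H(Q)$. The second equality follows from the chain rule: since $Q$ is a function of $X$, $I(X;Y) = I(XQ;Y) = I(Q;Y) + I(X;Y \mid Q)$, and since $Q$ is also a function of $Y$, $I(Q;Y) = H(Q)$.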