Computational Complexity Theory is the field that studies the inherent costs of algorithms for solving mathematical problems. Its major goal is to identify the limits of what is efficiently computable in natural computational models. The field ranges from quantum computing, to determining the minimum size of circuits that compute basic mathematical functions, to the foundations of cryptography and security. Computational complexity emerged from the combination of logic, combinatorics, information theory, and operations research, and coalesced around the central problem of "P versus NP" (one of the seven Millennium Prize Problems of the Clay Mathematics Institute). While this problem remains open, the field has grown in both scope and sophistication. Currently, some of the most active research areas in computational complexity are:

2 Communication Complexity

Communication complexity (first introduced by Yao) has proven to be one of the most fruitful areas of complexity theory. In this model, two or more parties each hold a different input, and wish to compute some joint function of all of their inputs. The complexity of the function is the number of bits the parties must exchange in order to determine its value. This model has proven extremely useful for two reasons. On the one hand, the model is very simple, so many computational processes (such as circuits, streaming computation, and many others) can be viewed as executing communication protocols. On the other hand, over the years powerful techniques have been developed for proving lower bounds on the communication complexity of different types of functions, which in turn yield lower bounds on other computational processes. At this workshop, we discussed a few fundamental recent results about communication complexity.

2.1 Presentation Highlights

2.1.1 Information Complexity

Mark Braverman gave an overview of information complexity.
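Before turning to information complexity, the basic two-party model above can be illustrated with a minimal sketch. The names and structure here are our own for illustration; the point is only that a protocol is measured by the bits exchanged, and that the trivial protocol (Alice sends her entire input) gives every function on n-bit inputs communication complexity at most n + 1.

```python
# A minimal sketch of the two-party communication model: Alice holds x,
# Bob holds y, and they want to compute f(x, y) while counting the bits
# exchanged. The trivial protocol below -- Alice sends her whole input,
# Bob replies with the one-bit answer -- uses n + 1 bits.
# (Function names here are illustrative, not from the workshop.)

def trivial_protocol(x: str, y: str, f) -> tuple[int, int]:
    """Alice sends all of x; Bob computes f(x, y) and sends back one bit.
    Returns (result, bits_exchanged)."""
    transcript = x                 # Alice -> Bob: n bits
    result = f(transcript, y)     # Bob now knows both inputs
    transcript += str(result)     # Bob -> Alice: 1 bit (the answer)
    return result, len(transcript)

# Example: the equality function EQ(x, y) = 1 iff x == y.
eq = lambda x, y: int(x == y)
result, bits = trivial_protocol("1011", "1011", eq)
print(result, bits)  # -> 1 5
```

The interesting questions in the area are exactly about when one can beat, or cannot beat, this trivial upper bound.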
Information complexity is closely related to communication complexity, but instead of counting the number of bits exchanged, it studies the amount of information (in the sense of Shannon's Information Theory) that the parties must exchange to solve the problem. Information complexity has found many applications within communication complexity and related areas. The information complexity of problems has been shown to exhibit some of the attractive properties of Shannon's entropy. For example, it is additive over independent problems. This additivity allows one to obtain, among other things, tight bounds on the communication complexity of problems. In the talk, Mark discussed the information complexity of the AND function (where Alice and Bob each get a bit and need to output the AND of their inputs). Through a known connection, understanding the information complexity of the AND function allows one to obtain a tight formula of 0.4827...n + o(n) for the communication complexity of the Set Disjointness problem, where Alice and Bob are given subsets X and Y of {1, ..., n} and need to decide whether X ∩ Y = ∅. The talk was largely based...
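To make the objects above concrete, here is a small illustrative sketch (ours, not from the presentation) of the Set Disjointness function. Encoding the sets X and Y as n-bit characteristic vectors, DISJ(x, y) = 1 exactly when AND(x_i, y_i) = 0 in every coordinate i, which is why the information complexity of the single-bit AND function governs the 0.4827...n bound mentioned above.

```python
# Illustrative sketch of Set Disjointness: Alice holds X, Bob holds Y,
# both subsets of {1,...,n} encoded as n-bit characteristic vectors.
# DISJ(x, y) = 1 iff no coordinate has x_i = y_i = 1, i.e. the sets
# are disjoint. (This code is a model sketch, not from the talk.)

def disjointness(x: list[int], y: list[int]) -> int:
    """1 if the sets encoded by bit-vectors x, y are disjoint, else 0."""
    assert len(x) == len(y)
    return int(all(xi & yi == 0 for xi, yi in zip(x, y)))

# X = {1, 3}, Y = {2, 4} over {1,...,4}: disjoint.
print(disjointness([1, 0, 1, 0], [0, 1, 0, 1]))  # -> 1
# X = {1, 3}, Y = {3}: they intersect.
print(disjointness([1, 0, 1, 0], [0, 0, 1, 0]))  # -> 0
```

The trivial protocol solves this with n + 1 bits; the result discussed in the talk pins the true communication complexity down to 0.4827...n + o(n).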