We prove new lower bounds for learning intersections of halfspaces, one of the most important concept classes in computational learning theory. Our main result is that any statistical-query algorithm for learning the intersection of $\sqrt{n}$ halfspaces in $n$ dimensions must make $2^{\Omega(\sqrt{n})}$ queries. This is the first non-trivial lower bound on the statistical query dimension for this concept class (the previous best lower bound was $n^{\Omega(\log n)}$). Our lower bound holds even for intersections of low-weight halfspaces; in the latter case, it is nearly tight.

We also show that the intersection of two majorities (low-weight halfspaces) cannot be computed by a polynomial threshold function (PTF) with fewer than $n^{\Omega(\log n/\log\log n)}$ monomials. This is the first super-polynomial lower bound on the PTF length of this concept class, and it is nearly optimal. For intersections of $k = \omega(\log n)$ low-weight halfspaces, we improve our lower bound to $\min\{2^{\Omega(\sqrt{n})},\, n^{\Omega(k/\log k)}\}$, which is also nearly optimal. As a consequence, intersections of even two halfspaces are not computable by polynomial-weight PTFs, the most expressive class of functions known to be efficiently learnable via Jackson's Harmonic Sieve algorithm. Finally, we report our progress on the weak learnability of intersections of halfspaces under the uniform distribution.
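For quick reference, the bounds claimed above can be collected in a single display. This is only a restatement of the statements in this abstract (not a new result), with $n$ denoting the dimension and $k$ the number of low-weight halfspaces intersected, as in the text.

% Restatement of the bounds claimed in the abstract; no new results.
\begin{align*}
  \text{SQ complexity of intersections of } \sqrt{n} \text{ halfspaces:}
    &\quad 2^{\Omega(\sqrt{n})} \text{ queries} \\
  \text{PTF length of the intersection of two majorities:}
    &\quad n^{\Omega(\log n / \log\log n)} \text{ monomials} \\
  \text{PTF length for } k = \omega(\log n) \text{ low-weight halfspaces:}
    &\quad \min\bigl\{\, 2^{\Omega(\sqrt{n})},\; n^{\Omega(k/\log k)} \,\bigr\}
\end{align*}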