We describe an optimal randomized MapReduce algorithm for the problem of triangle enumeration that requires $\BO{E^{3/2}/(M\sqrt m)}$ rounds, where $m$ denotes the expected memory size of a reducer and $M$ the total available space. This generalizes the well-known vertex-partitioning approach of (Suri and Vassilvitskii, 2011) to multiple rounds, significantly increasing the size of the graphs that can be handled on a given system. We also give new theoretical (high probability) bounds on the work needed in each reducer, addressing the ``curse of the last reducer'': indeed, ours is the first work to guarantee the maximum load of each reducer for an arbitrary input graph. Our experimental evaluation shows that our approach scales well, that it is competitive with existing methods, improving performance by a factor of up to $2\times$, and that it can significantly increase the size of the datasets that can be processed.