We study the statistical performance of semidefinite programming (SDP) relaxations for clustering under random graph models. Under the $\mathbb{Z}_2$ Synchronization model, the Censored Block Model, and the Stochastic Block Model, we show that SDP achieves an error rate of the form $\exp\big(-(1-o(1))\,\bar{n} I^*\big)$. Here $\bar{n}$ is an appropriate multiple of the number of nodes and $I^*$ is an information-theoretic measure of the signal-to-noise ratio. We provide matching lower bounds on the Bayes error for each model and therefore demonstrate that the SDP approach is Bayes optimal. As a corollary, our results imply that SDP achieves the optimal exact recovery threshold under each model. Furthermore, we show that SDP is robust: the above bound remains valid under semirandom versions of the models in which the observed graph is modified by a monotone adversary. Our proof is based on a novel primal-dual analysis of SDP under a unified framework for all three models, and the analysis shows that SDP tightly approximates a joint majority voting procedure.
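For concreteness, one standard form of such an SDP relaxation for two-cluster recovery is sketched below; this is an illustrative instance only, and the precise programs analyzed for the three models may include additional constraints (e.g., a balance constraint for the Stochastic Block Model). Given an observed matrix $A \in \mathbb{R}^{n \times n}$ (e.g., a centered adjacency matrix), one solves
\begin{align*}
\widehat{X} \;\in\; \operatorname*{arg\,max}_{X \in \mathbb{R}^{n \times n}} \;& \langle A, X \rangle \\
\text{subject to} \;& X \succeq 0, \qquad X_{ii} = 1 \ \text{ for all } i \in [n],
\end{align*}
and then extracts a cluster labeling from $\widehat{X}$, for instance by taking the entrywise signs of its leading eigenvector.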