In this paper, we study the optimal control of a general mean-field stochastic differential equation with constraints. We establish a set of necessary conditions for the optimal control in a setting where the coefficients of the controlled system depend, nonlinearly, on both the state process and its probability law. The control domain is not necessarily convex. The proof of our main result relies on first- and second-order derivatives with respect to the measure in the Wasserstein space of probability measures, together with a variational principle. We prove Peng-type necessary optimality conditions for a general mean-field system under state constraints. Our result generalizes the stochastic maximum principle of Buckdahn et al. [2] to the case with constraints.
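For orientation, a prototypical problem of this type (written here in generic notation, not the exact formulation of the paper) is the control of a mean-field dynamics
\[
dX_t = b\bigl(t, X_t, P_{X_t}, u_t\bigr)\,dt + \sigma\bigl(t, X_t, P_{X_t}, u_t\bigr)\,dB_t, \qquad X_0 = x_0,
\]
where $P_{X_t}$ denotes the law of $X_t$, with the cost functional
\[
J(u) = E\Bigl[\int_0^T f\bigl(t, X_t, P_{X_t}, u_t\bigr)\,dt + h\bigl(X_T, P_{X_T}\bigr)\Bigr]
\]
minimized over admissible controls $u$ subject to a state constraint, for instance of the terminal form $E\bigl[g(X_T, P_{X_T})\bigr] \in K$ for a given set $K$; the precise class of constraints treated in the paper is specified in the problem formulation below.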