We propose CONFIG (CONstrained efFIcient Global Optimization), a simple and effective algorithm for constrained efficient global optimization of expensive black-box functions. In each step, our algorithm solves an auxiliary constrained optimization problem with lower confidence bound (LCB) surrogates as both the objective and the constraints to determine the next sample. Theoretically, we show that our algorithm enjoys the same cumulative regret bound as in the unconstrained case. We further show that the cumulative constraint violations admit upper bounds, in terms of mutual information, similar to the cumulative regret bounds on the objective function. For the commonly used Matérn and Squared Exponential kernels, our bounds are sublinear and allow us to derive a convergence rate to the optimal solution of the original constrained problem. Moreover, our method naturally provides a scheme for declaring infeasibility when the original black-box optimization problem is infeasible. Numerical experiments corroborate the effectiveness of our algorithm.
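To make the sampling step concrete, the following is a minimal sketch of one iteration of the LCB-based auxiliary problem, not the authors' implementation. It assumes a toy 1D objective and constraint, scikit-learn Gaussian process surrogates, a fixed exploration weight `beta`, and a brute-force grid in place of a proper constrained solver; all of these are illustrative choices.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Hypothetical "expensive" black box: minimize f(x) s.t. g(x) <= 0 on [0, 1].
f = lambda x: (x - 0.7) ** 2
g = lambda x: 0.3 - x  # feasible region is x >= 0.3

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(6, 1))  # initial evaluated samples
beta = 2.0                              # LCB exploration weight (assumed value)

# GP surrogates for the objective and the constraint (illustrative kernels).
gp_f = GaussianProcessRegressor(kernel=Matern(nu=2.5)).fit(X, f(X).ravel())
gp_g = GaussianProcessRegressor(kernel=Matern(nu=2.5)).fit(X, g(X).ravel())

def lcb(gp, x):
    # Lower confidence bound: posterior mean minus scaled posterior std.
    mu, sd = gp.predict(x, return_std=True)
    return mu - beta * sd

# Auxiliary problem: minimize LCB of f subject to LCB of g <= 0,
# solved here by grid search purely for simplicity.
grid = np.linspace(0.0, 1.0, 1001).reshape(-1, 1)
feasible = lcb(gp_g, grid) <= 0.0
obj = np.where(feasible, lcb(gp_f, grid), np.inf)
x_next = float(grid[int(np.argmin(obj))])  # next point to evaluate
```

Because the LCB of the constraint is optimistic, the auxiliary feasible set over-approximates the true one; if it ever becomes empty, that can serve as a certificate for declaring the original problem infeasible.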