The problem of univariate mean change point detection and localization based on a sequence of n independent observations with piecewise constant means has been intensively studied for more than half a century, and serves as a blueprint for change point problems in more complex settings. We provide a complete characterization of this classical problem in a general framework in which the upper bound σ² on the noise variance, the minimal spacing ∆ between two consecutive change points and the minimal magnitude κ of the changes are allowed to vary with n. We first show that consistent localization of the change points is impossible in the low signal-to-noise ratio regime. In contrast, when the signal-to-noise ratio κ√∆/σ diverges with n at a rate of at least √log(n), we demonstrate that two computationally efficient change point estimators, one based on the solution to an ℓ₀-penalized least squares problem and the other on the popular wild binary segmentation algorithm, are both consistent and achieve a localization rate of the order (σ²/κ²) log(n). We further show that this rate is minimax optimal, up to a log(n) term.
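
To illustrate the second of the two estimators mentioned above, the following is a minimal Python sketch of wild binary segmentation: the maximal CUSUM statistic is computed over randomly drawn intervals and the segment is split recursively wherever that maximum exceeds a threshold. The interval sampling scheme, the threshold choice, and all function names here are illustrative assumptions, not the specific tuning analyzed in the paper.

```python
import numpy as np

def cusum(x, s, e, t):
    """CUSUM statistic of x[s:e] at candidate split t (s < t < e)."""
    n1, n2 = t - s, e - t
    return np.sqrt(n1 * n2 / (n1 + n2)) * abs(x[s:t].mean() - x[t:e].mean())

def wild_binary_segmentation(x, intervals, tau):
    """Sketch of wild binary segmentation.

    intervals: list of (s, e) index pairs drawn at random beforehand.
    tau: detection threshold, e.g. of order sigma * sqrt(log n).
    Returns a sorted list of estimated change point locations.
    """
    estimates = []

    def recurse(s, e):
        if e - s < 2:
            return
        best_stat, best_t = -np.inf, None
        # maximize the CUSUM statistic over the random intervals contained
        # in (s, e) and over the current segment itself
        for a, b in intervals + [(s, e)]:
            if a < s or b > e or b - a < 2:
                continue
            for t in range(a + 1, b):
                stat = cusum(x, a, b, t)
                if stat > best_stat:
                    best_stat, best_t = stat, t
        if best_t is not None and best_stat > tau:
            estimates.append(best_t)
            recurse(s, best_t)
            recurse(best_t, e)

    recurse(0, len(x))
    return sorted(estimates)

# toy example: one mean change of size kappa = 2 at index 100
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
intervals = [tuple(sorted(rng.choice(len(x), size=2, replace=False)))
             for _ in range(100)]
print(wild_binary_segmentation(x, intervals, tau=3 * np.sqrt(np.log(len(x)))))
```

In this sketch the threshold is taken proportional to σ√log(n) with σ = 1, which mirrors the √log(n) signal-to-noise requirement above; any constant and interval count shown here are chosen for illustration only.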