Consider the noisy underdetermined system of linear equations: $y = Ax_0 + z_0$, with $n \times N$ measurement matrix $A$, $n < N$, and Gaussian white noise $z_0 \sim N(0, \sigma^2 I)$. Both $y$ and $A$ are known, both $x_0$ and $z_0$ are unknown, and we seek an approximation to $x_0$. When $x_0$ has few nonzeros, useful approximations are often obtained by $\ell_1$-penalized $\ell_2$ minimization, in which the reconstruction $\hat{x}^{1,\lambda}$ solves $\min \|y - Ax\|_2^2/2 + \lambda \|x\|_1$. Evaluate performance by mean-squared error ($\mathrm{MSE} = E\|\hat{x}^{1,\lambda} - x_0\|_2^2/N$). Consider matrices $A$ with iid Gaussian entries and a large-system limit in which $n, N \to \infty$ with $n/N \to \delta$ and $k/n \to \rho$, where $k$ is the number of nonzeros in $x_0$. Call the ratio $\mathrm{MSE}/\sigma^2$ the noise sensitivity. We develop formal expressions for the MSE of $\hat{x}^{1,\lambda}$ and evaluate its worst-case formal noise sensitivity over all types of $k$-sparse signals. The phase space $0 \le \delta, \rho \le 1$ is partitioned by the curve $\rho = \rho_{\mathrm{MSE}}(\delta)$ into two regions. Formal noise sensitivity is bounded throughout the region $\rho < \rho_{\mathrm{MSE}}(\delta)$ and is unbounded throughout the region $\rho > \rho_{\mathrm{MSE}}(\delta)$. The phase boundary $\rho = \rho_{\mathrm{MSE}}(\delta)$ is identical to the previously known phase transition curve for equivalence of $\ell_1$-$\ell_0$ minimization in the $k$-sparse noiseless case. Hence a single phase boundary describes the fundamental phase transitions for both the noiseless and noisy cases. Extensive computational experiments validate the predictions of this formalism, including the existence of game-theoretic structures underlying it (saddlepoints in the payoff, least-favorable signals, and maximin penalization). Underlying our formalism is an approximate message passing soft-thresholding algorithm (AMP) introduced earlier by the authors. Other papers by the authors detail expressions for the formal MSE of AMP and its close connection to $\ell_1$-penalized reconstruction. Here we derive the minimax formal MSE of AMP and then read out results for $\ell_1$-penalized reconstruction.
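For concreteness, the following is a minimal sketch of an AMP soft-thresholding iteration for this setup, together with the MSE evaluation described above. It is an illustrative sketch only, not the calibrated implementation studied in the paper: the parameter names (`alpha`, `n_iter`), the residual-based threshold estimate, and the synthetic-data values of $(n, N, k, \sigma)$ are assumptions chosen for the example; the precise calibration between the AMP threshold and the penalty $\lambda$ is treated in the companion papers.

```python
import numpy as np


def soft_threshold(u, theta):
    """Soft thresholding, the proximal map of the l1 norm."""
    return np.sign(u) * np.maximum(np.abs(u) - theta, 0.0)


def amp_lasso(y, A, alpha=1.5, n_iter=30):
    """Sketch of an AMP soft-thresholding iteration (illustrative parameters).

    y      : length-n measurement vector
    A      : n x N measurement matrix (iid Gaussian entries assumed)
    alpha  : threshold multiplier (assumed tuning parameter)
    n_iter : number of iterations (assumed)
    """
    n, N = A.shape
    delta = n / N
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iter):
        # Estimate the effective noise level from the current residual.
        tau = np.linalg.norm(z) / np.sqrt(n)
        # Pseudo-data followed by soft thresholding.
        u = x + A.T @ z
        x_new = soft_threshold(u, alpha * tau)
        # Residual update with the Onsager correction term:
        # (1/delta) * z * average derivative of the soft-threshold map.
        onsager = (z / delta) * np.mean(np.abs(x_new) > 0)
        z = y - A @ x_new + onsager
        x = x_new
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, N, k, sigma = 250, 500, 50, 0.1        # delta = 0.5, rho = 0.2 (assumed example values)
    A = rng.normal(size=(n, N)) / np.sqrt(n)  # iid Gaussian measurement matrix
    x0 = np.zeros(N)
    x0[rng.choice(N, size=k, replace=False)] = rng.normal(size=k)
    y = A @ x0 + sigma * rng.normal(size=n)
    x_hat = amp_lasso(y, A)
    print("per-coordinate MSE:", np.mean((x_hat - x0) ** 2))
```

The Onsager correction term in the residual update is what distinguishes AMP from plain iterative soft thresholding; it is this correction that makes the per-iteration effective noise approximately Gaussian and underlies the formal MSE expressions referred to above.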