Basis Pursuit (BP), Basis Pursuit DeNoising (BPDN), and LASSO are popular methods for identifying important predictors in the high-dimensional linear regression model Y = Xβ + ε. By definition, when ε = 0, BP uniquely recovers β when β satisfies the identifiability condition: Xβ = Xb and β ≠ b imply ‖b‖₁ > ‖β‖₁. Furthermore, LASSO can recover the sign of β only under a much stronger irrepresentability condition. Meanwhile, it is known that the model selection properties of LASSO can be improved by hard-thresholding its estimates. This article supports these findings by proving that thresholded LASSO, thresholded BPDN, and thresholded BP recover the sign of β, in both the noisy and noiseless cases, if and only if β is identifiable and large enough. In particular, if X has iid Gaussian entries and the number of predictors grows linearly with the sample size, then these thresholded estimators can recover the sign of β when the signal sparsity is asymptotically below the Donoho-Tanner transition curve. This is in contrast to the regular LASSO, which asymptotically recovers the sign of β only when the signal sparsity tends to 0. Numerical experiments show that the identifiability condition, unlike the irrepresentability condition, does not seem to be affected by the structure of the correlations in the X matrix.
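The two-step procedure described above (fit LASSO, then hard-threshold the coefficients and compare signs with the true β) can be illustrated with a minimal sketch. This is not the paper's code; the dimensions (n, p), sparsity, noise level, regularization parameter alpha, and threshold tau below are hypothetical choices for illustration under an iid Gaussian design.

```python
import numpy as np
from sklearn.linear_model import Lasso

# Illustrative sketch: thresholded LASSO sign recovery under an iid Gaussian design.
rng = np.random.default_rng(0)
n, p, k = 200, 400, 10                 # sample size, number of predictors, sparsity (assumed)
X = rng.standard_normal((n, p))        # iid Gaussian design
beta = np.zeros(p)
beta[:k] = 5.0                         # "large enough" nonzero coefficients (assumed)
y = X @ beta + 0.5 * rng.standard_normal(n)

# Step 1: LASSO estimate (alpha chosen ad hoc for this sketch)
lasso = Lasso(alpha=0.1, fit_intercept=False, max_iter=10000).fit(X, y)

# Step 2: hard-threshold the LASSO coefficients at a hypothetical level tau
tau = 1.0
beta_thr = np.where(np.abs(lasso.coef_) > tau, lasso.coef_, 0.0)

# Check sign recovery: signs of the thresholded estimate vs. the true beta
print("signs recovered:", np.array_equal(np.sign(beta_thr), np.sign(beta)))
```

In this setting one would typically expect sign recovery to succeed when the nonzero coefficients are large relative to the noise and the sparsity is small enough, in line with the identifiability requirement stated in the abstract.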