The optimization of an information criterion in a variable selection procedure leads to an additional bias, which can be substantial in sparse, high-dimensional data. This bias can be compensated for by applying shrinkage when estimating within the selected models. This paper presents modified information criteria for use in variable selection and estimation without shrinkage. The analysis motivating the modified criteria follows two routes. The first, explored for signal-plus-noise observations only, proceeds by comparing estimators with and without shrinkage. The second, developed for general regression models, describes the optimization or selection bias as a double-sided effect, named a mirror effect in this paper: among the numerous insignificant variables, those with large, noisy observed values present themselves as more valuable than an arbitrary variable, while in fact they carry more noise. The mirror effect is worked out for Akaike's Information Criterion and for Mallows' $C_p$, with special attention to the latter criterion as a stopping rule in a least angle regression routine. The result is a new stopping rule that assesses not the quality of the lasso shrinkage estimator, but that of the least squares estimator, without shrinkage, within the same selection.
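
To make the setting concrete, the following is a minimal sketch of the idea behind the stopping rule: run a least angle regression path, and at each step evaluate Mallows' $C_p$ on the least squares refit of the currently selected variables rather than on the shrunken lasso coefficients. The sketch uses the classical $C_p$ formula and scikit-learn's lars_path on simulated data; it does not reproduce the paper's modified, mirror-corrected criterion, and all names and data choices are illustrative assumptions.

    # Illustrative sketch only: classical C_p applied to the unshrunken
    # OLS refit along a LARS path (not the paper's modified criterion).
    import numpy as np
    from sklearn.linear_model import lars_path

    rng = np.random.default_rng(0)
    n, p, k = 100, 50, 5                      # sparse setting: k true signals
    X = rng.standard_normal((n, p))
    beta = np.zeros(p)
    beta[:k] = 3.0
    y = X @ beta + rng.standard_normal(n)

    # LARS gives the order in which variables enter the model.
    alphas, active_order, coefs = lars_path(X, y, method="lar")

    # Noise variance estimated from the full-model least squares residuals.
    bhat_full = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ bhat_full
    sigma2 = resid @ resid / (n - p)

    best_cp, best_S = np.inf, []
    for m in range(1, len(active_order) + 1):
        S = list(active_order[:m])
        # Least squares refit on the selected variables: no shrinkage.
        b = np.linalg.lstsq(X[:, S], y, rcond=None)[0]
        rss = np.sum((y - X[:, S] @ b) ** 2)
        cp = rss / sigma2 - n + 2 * m         # classical Mallows' C_p
        if cp < best_cp:
            best_cp, best_S = cp, S
    print("selected variables:", sorted(best_S))

Refitting by least squares at each step reflects the abstract's point: the rule judges the selection by the estimator actually used afterwards, not by the lasso estimator that produced the path.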