This paper concerns a class of model selection criteria based on cross-validation techniques and estimative predictive densities. Both the simple (leave-one-out) and the multifold (leave-m-out) cross-validation procedures are considered. These cross-validation criteria define suitable estimators for the expected Kullback-Leibler risk, which measures the expected discrepancy between the fitted candidate model and the true one. In particular, we shall investigate the potential bias of these estimators under alternative asymptotic regimes for m. The results are obtained within the general context of independent, but not necessarily identically distributed, observations and by assuming that the candidate model may not contain the true distribution. An application to the class of normal regression models is also presented, and simulation results are obtained in order to gain further understanding of the behavior of these estimators.
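As a concrete illustration of the kind of criterion discussed above, the following is a minimal sketch (not the paper's own implementation) of a leave-one-out cross-validation estimate of the expected negative log estimative predictive density for a normal regression model; the data-generating setup, sample sizes, and function names are hypothetical choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normal regression setup: y = X beta + Gaussian noise.
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, -0.5, 2.0])
y = X @ beta_true + rng.normal(scale=1.5, size=n)

def normal_logpdf(y, mu, sigma2):
    """Log density of a normal with mean mu and variance sigma2."""
    return -0.5 * (np.log(2 * np.pi * sigma2) + (y - mu) ** 2 / sigma2)

def loo_cv_score(X, y):
    """Leave-one-out CV average of the negative log estimative
    predictive density: for each i, fit the model without the i-th
    observation by maximum likelihood, plug the estimates into the
    normal density, and evaluate it at the held-out point. This is
    an estimator of the expected Kullback-Leibler risk up to a
    model-free constant."""
    n = len(y)
    total = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        Xi, yi = X[mask], y[mask]
        beta_hat, *_ = np.linalg.lstsq(Xi, yi, rcond=None)
        resid = yi - Xi @ beta_hat
        sigma2_hat = resid @ resid / len(yi)  # ML variance estimate
        total += normal_logpdf(y[i], X[i] @ beta_hat, sigma2_hat)
    return -total / n

score = loo_cv_score(X, y)
print(score)
```

Under model selection, one would compute this score for each candidate design matrix and prefer the model with the smallest value; a leave-m-out variant replaces the single held-out index with a held-out subset of size m.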