Divergence functions are interesting discrepancy measures: although they are not true distances, they can be used to quantify how far apart two points are. Curiously, when applied to random variables they lead to a notion of best predictor that coincides with the usual best predictor under the Euclidean distance. From a divergence function one can derive a Riemannian metric, which yields a true distance between random variables, and with respect to this distance the best predictors no longer coincide with their Euclidean counterparts. The purpose of this note is to point out that there are many interesting ways of measuring the distance between random variables, and to study the notions of best predictor they lead to.
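The coincidence mentioned above can be checked numerically. The sketch below (an illustration, not taken from this note) uses a Bregman divergence $D_\varphi(a,b) = \varphi(a) - \varphi(b) - \varphi'(b)(a-b)$ with the hypothetical choice $\varphi(t) = t\log t - t$, and verifies on simulated data that the constant $c$ minimizing the expected divergence $\mathbb{E}[D_\varphi(X, c)]$ is the mean $\mathbb{E}[X]$, i.e. the Euclidean best predictor:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.5, 2.0, size=10_000)  # positive samples

# Bregman divergence generated by phi(t) = t log t - t (illustrative choice)
def phi(t):
    return t * np.log(t) - t

def dphi(t):
    return np.log(t)

def bregman(a, b):
    return phi(a) - phi(b) - dphi(b) * (a - b)

# Scan candidate predictors c and pick the one with smallest expected divergence
cs = np.linspace(0.5, 2.0, 2001)
risks = np.array([bregman(x, c).mean() for c in cs])
best = cs[np.argmin(risks)]

# The minimizer agrees with the sample mean, the Euclidean best predictor
print(best, x.mean())
```

Up to the grid resolution, `best` matches `x.mean()`, illustrating that the divergence-based best predictor and the Euclidean one agree.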