We show, in the spirit of Simon's Wonderland Theorem, that, typically in Baire's sense, the rates at which the solutions of the Schrödinger equation escape, in time average, from every finite-dimensional subspace depend on the subsequences of time tending to infinity.