A Lagrangian stochastic model may sometimes generate velocities of unrealistic magnitude, and several authors have taken ad hoc steps to control their impact. The occurrence of these "rogue velocities" would, at first sight, appear to contradict the model's being well mixed; however, the problem typically arises in a complex (and in some cases discontinuous, i.e., gridded) regime of turbulence, which may implicitly impose limits on the allowable size of the time step that have not been respected. This article seeks to observe rogue velocities in an artificial regime of 1-D turbulence, in which two regions of differing constant velocity variance are joined by a ramp, so that the gradient in velocity variance is discontinuous. The evolution of an initially well-mixed distribution of tracer is computed by integrating the Chapman-Kolmogorov equation, and the resulting calculation is compared with the corresponding stochastic solution. It is found that unless the time step is small relative to an inhomogeneity time scale implied by the field of Eulerian velocity statistics, the well-mixed condition is violated: large velocities occur with greater probability than they ought to.
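The setup described above can be sketched numerically. The fragment below is an illustrative assumption, not the article's own code: it uses a standard well-mixed (Thomson-type) Langevin formulation for Gaussian inhomogeneous 1-D turbulence, with a velocity-variance profile consisting of two constant regions joined by a linear ramp (so its gradient is discontinuous). The function names, profile parameters, domain length, and time-step values are all hypothetical choices for the sketch.

```python
import numpy as np

def sigma2(x, s2_a=1.0, s2_b=4.0, x1=4.0, x2=6.0):
    """Velocity variance: two constant regions joined by a linear ramp."""
    ramp = s2_a + (s2_b - s2_a) * (x - x1) / (x2 - x1)
    return np.where(x < x1, s2_a, np.where(x > x2, s2_b, ramp))

def dsigma2_dx(x, s2_a=1.0, s2_b=4.0, x1=4.0, x2=6.0):
    """Gradient of the variance profile (discontinuous at x1 and x2)."""
    slope = (s2_b - s2_a) / (x2 - x1)
    return np.where((x >= x1) & (x <= x2), slope, 0.0)

def simulate(n=4000, steps=500, dt=0.01, tau=1.0, L=10.0, seed=0):
    """Euler-Maruyama integration of the well-mixed 1-D Langevin model.

    Returns the ensemble mean of u^2 / sigma2(x), which stays close to 1
    while the well-mixed condition is preserved.
    """
    rng = np.random.default_rng(seed)
    # Well-mixed initial condition: uniform positions, u ~ N(0, sigma2(x)).
    x = rng.uniform(0.0, L, n)
    u = rng.standard_normal(n) * np.sqrt(sigma2(x))
    for _ in range(steps):
        s2 = sigma2(x)
        # Drift: relaxation plus the well-mixed correction for d(sigma^2)/dx.
        drift = -u / tau + 0.5 * dsigma2_dx(x) * (1.0 + u * u / s2)
        u = u + drift * dt + np.sqrt(2.0 * s2 / tau * dt) * rng.standard_normal(n)
        x = x + u * dt
        # Elastic reflection at the walls preserves the Gaussian velocity PDF.
        over, under = x > L, x < 0.0
        x = np.where(over, 2.0 * L - x, np.where(under, -x, x))
        u = np.where(over | under, -u, u)
    return float(np.mean(u * u / sigma2(x)))
```

With a time step small compared with both `tau` and the inhomogeneity time scale set by the ramp, the normalized velocity variance remains near unity; re-running with a much larger `dt` is one way to probe the breakdown of the well-mixed condition discussed in the abstract.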