Abstract-This paper develops a new understanding of mean shift algorithms from an information theoretic perspective. We show that Gaussian Blurring Mean Shift (GBMS) directly minimizes Rényi's quadratic entropy of the dataset and hence is unstable by definition. Its stable counterpart, Gaussian Mean Shift (GMS), minimizes Rényi's "cross" entropy, whose local stationary solutions are the modes of the dataset. In doing so, we answer the question "What do mean shift algorithms optimize?", which naturally highlights the properties of these algorithms. A consequence of this new understanding is the superior performance of GMS over GBMS, which we demonstrate in a wide variety of applications ranging from mode finding to clustering and image segmentation.

Index Terms-Mean shift, information theoretic learning, Rényi's entropy.
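For concreteness, the quantities named above can be written out as follows. This is a minimal sketch assuming the standard information theoretic learning estimators with a Gaussian kernel $G_\sigma(t) \propto \exp(-\|t\|^2/2\sigma^2)$ over samples $x_1, \dots, x_N \in \mathbb{R}^d$; the kernel-width convention ($\sigma\sqrt{2}$, from convolving two $\sigma$-width Gaussians) and the omitted normalization constant are assumptions, not details quoted from this paper.

\begin{align}
  % Renyi's quadratic entropy of the dataset, estimated with a Gaussian
  % KDE. GBMS applies the mean shift update to every point, blurring the
  % data and driving this quantity down, which is the claimed source of
  % its instability.
  \hat{H}_2(X) &= -\log \frac{1}{N^2}
      \sum_{i=1}^{N} \sum_{j=1}^{N} G_{\sigma\sqrt{2}}(x_i - x_j) \\
  % Renyi's "cross" entropy between the moving set X and the fixed
  % original data X^{(0)}. GMS descends this instead, so its stationary
  % points are modes of the KDE built on X^{(0)}.
  \hat{H}_2\bigl(X; X^{(0)}\bigr) &= -\log \frac{1}{N^2}
      \sum_{i=1}^{N} \sum_{j=1}^{N} G_{\sigma\sqrt{2}}\bigl(x_i - x_j^{(0)}\bigr) \\
  % The GMS fixed-point update for a point x_i, with the original data
  % x_j^{(0)} held fixed; GBMS uses the same form but substitutes the
  % current, already-moved points for x_j^{(0)}.
  x_i &\leftarrow
      \frac{\sum_{j} G_{\sigma}\bigl(x_i - x_j^{(0)}\bigr)\, x_j^{(0)}}
           {\sum_{j} G_{\sigma}\bigl(x_i - x_j^{(0)}\bigr)}
\end{align}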