A new outlier-robust approach to estimate the magnitude squared coherence of a random vector sequence, a common task required in a variety of estimation and detection problems, is proposed. The proposed estimator is based on Rényi's entropy, an information theoretic kernel-based measure that proves to be inversely proportional to the determinant of a regularized version of the covariance matrix in the proper Gaussian case. The trade-off between accuracy and robustness in terms of bias and variance is analytically and numerically characterized, showing a dependence on the relative kernel bandwidth and the available data size.