Kullback-Leibler (KL) divergence is one of the most important divergence measures between probability distributions. In this paper, we investigate the properties of KL divergence between Gaussians. First, for any two n-dimensional Gaussians N_1 and N_2, we find the supremum of KL(N_1||N_2) when KL(N_2||N_1) ≤ ε for ε > 0. This reveals the approximate symmetry of small KL divergence between Gaussians. We also find the infimum of KL(N_1||N_2) when KL(N_2||N_1) ≥ M for M > 0. Second, for any three n-dimensional Gaussians N_1, N_2, and N_3, we find a bound of KL(N_1||N_3) if KL(N_1||N_2) and KL(N_2||N_3) are bounded. This reveals that the KL divergence between Gaussians satisfies a relaxed triangle inequality. Importantly, all the bounds in the theorems presented in this paper are independent of the dimension n.
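The quantity studied above has a well-known closed form for Gaussians, KL(N_1||N_2) = ½[tr(Σ_2⁻¹Σ_1) + (μ_2−μ_1)ᵀΣ_2⁻¹(μ_2−μ_1) − n + ln(det Σ_2/det Σ_1)]. As a minimal sketch of the approximate-symmetry phenomenon described in the abstract, the snippet below (with hypothetical parameters chosen for illustration) computes both directions of the divergence for two nearby 3-dimensional Gaussians:

```python
import numpy as np

def kl_gaussian(mu1, sigma1, mu2, sigma2):
    """KL(N_1 || N_2) for Gaussians N_i = N(mu_i, Sigma_i), via the
    closed form 0.5 * [tr(S2^-1 S1) + (mu2-mu1)^T S2^-1 (mu2-mu1)
                       - n + ln(det S2 / det S1)]."""
    n = mu1.shape[0]
    inv2 = np.linalg.inv(sigma2)
    diff = mu2 - mu1
    # slogdet avoids overflow/underflow in the determinants
    _, logdet1 = np.linalg.slogdet(sigma1)
    _, logdet2 = np.linalg.slogdet(sigma2)
    return 0.5 * (np.trace(inv2 @ sigma1) + diff @ inv2 @ diff
                  - n + logdet2 - logdet1)

# Two nearby 3-dimensional Gaussians (illustrative parameters):
rng = np.random.default_rng(0)
mu1 = np.zeros(3)
mu2 = mu1 + 0.01 * rng.standard_normal(3)
sigma1 = np.eye(3)
sigma2 = sigma1 + 0.01 * np.diag(rng.random(3))

kl_12 = kl_gaussian(mu1, sigma1, mu2, sigma2)
kl_21 = kl_gaussian(mu2, sigma2, mu1, sigma1)
# When both divergences are small, the two directions nearly coincide,
# consistent with the approximate symmetry discussed in the paper.
```

For well-separated Gaussians, by contrast, KL(N_1||N_2) and KL(N_2||N_1) can differ by orders of magnitude, which is why a quantitative bound such as the one stated above is needed.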