If S is an infinite sequence over a finite alphabet Σ and β is a probability measure on Σ, then the dimension of S with respect to β, written dim^β(S), is a constructive version of Billingsley dimension that coincides with the (constructive Hausdorff) dimension dim(S) when β is the uniform probability measure. This paper shows that dim^β(S) and its dual Dim^β(S), the strong dimension of S with respect to β, can be used in conjunction with randomness to measure the similarity of two probability measures α and β on Σ. Specifically, we prove that the divergence formula

    dim^β(R) = Dim^β(R) = H(α) / (H(α) + D(α||β))

holds whenever α and β are computable, positive probability measures on Σ and R ∈ Σ^∞ is random with respect to α. In this formula, H(α) is the Shannon entropy of α, and D(α||β) is the Kullback-Leibler divergence between α and β. We also show that the above formula holds for all sequences R that are α-normal (in the sense of Borel) when dim^β(R) and Dim^β(R) are replaced by the more effective finite-state dimensions dim^β_FS(R) and Dim^β_FS(R). In the course of proving this, we also prove finite-state compression characterizations of dim^β_FS(S) and Dim^β_FS(S).
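As an illustrative sketch (not part of the paper), the right-hand side of the divergence formula H(α) / (H(α) + D(α||β)) can be evaluated numerically for concrete measures on a finite alphabet; the function names below are hypothetical helpers, and probability measures are represented as dictionaries mapping symbols to probabilities.

```python
import math

def shannon_entropy(alpha):
    # Shannon entropy H(alpha), in bits, of a probability measure
    # on a finite alphabet; terms with probability 0 contribute 0.
    return -sum(p * math.log2(p) for p in alpha.values() if p > 0)

def kl_divergence(alpha, beta):
    # Kullback-Leibler divergence D(alpha||beta), in bits.
    # Assumes beta is positive (beta[s] > 0 for every symbol s),
    # as required in the paper's hypotheses.
    return sum(p * math.log2(p / beta[s]) for s, p in alpha.items() if p > 0)

def divergence_ratio(alpha, beta):
    # Right-hand side H(alpha) / (H(alpha) + D(alpha||beta))
    # of the divergence formula.
    h = shannon_entropy(alpha)
    return h / (h + kl_divergence(alpha, beta))

# Example: uniform alpha on {0,1} versus a biased beta.
alpha = {"0": 0.5, "1": 0.5}
beta = {"0": 0.25, "1": 0.75}
print(divergence_ratio(alpha, beta))
```

Note that when α = β the divergence D(α||β) is 0 and the ratio is 1, consistent with α-random sequences having β-dimension 1 in that case.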