“…The Kullback-Leibler divergence introduced in [1] is used to quantify the similarity of two probability measures. It plays an important role in various domains such as statistical inference (see, e.g., [2,3]), metric learning [4,5], machine learning [6,7], computer vision [8,9], network security [10], feature selection and classification [11,12,13], physics [14], biology [15], medicine [16,17], and finance [18], among others. It is worth emphasizing that mutual information, widely used in many research directions (see, e.g., [19,20,21,22,23]), is a special case of the Kullback-Leibler divergence for a particular choice of measures.…”
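For concreteness, the special case mentioned in the last sentence is the standard identity relating mutual information to the Kullback-Leibler divergence between the joint distribution and the product of the marginals; the notation $P_{XY}$, $P_X$, $P_Y$ below is ours, not the excerpt's:
\[
I(X;Y) \;=\; D_{\mathrm{KL}}\!\left(P_{XY}\,\middle\|\,P_X \otimes P_Y\right)
\;=\; \sum_{x,y} P_{XY}(x,y)\,\log\frac{P_{XY}(x,y)}{P_X(x)\,P_Y(y)},
\]
written here for discrete $X$ and $Y$; in the continuous case the sum is replaced by an integral with respect to the corresponding densities.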