“…Rényi used the simplest set of postulates that characterize Shannon's entropy and introduced his own entropy and divergence measures (parameterized by an order α) that generalize the Shannon entropy and the KL divergence, respectively (Rényi, 1961). Moreover, the original Jensen-Rényi divergence (He, Hamza, & Krim, 2003), as well as the identically named divergence (Kluza, 2019) used in this letter, are non-f-divergence generalizations of the Jensen-Shannon divergence. Traditionally, Rényi's entropy and divergence have found applications in a wide range of problems, including lossless data compression (Campbell, 1965; Courtade & Verdú, 2014; Rached, Alajaji, & Campbell, 1999), hypothesis testing (Csiszár, 1995; Alajaji, Chen, & Rached, 2004), error probability (Ben-Bassat & Raviv, 2006), and guessing (Arikan, 1996; Verdú, 2015).…”
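To make the generalization concrete, the sketch below (an illustration, not from the quoted source) computes the Rényi divergence of order α between two discrete distributions, D_α(P‖Q) = (1/(α−1)) log Σᵢ pᵢ^α qᵢ^(1−α), and checks numerically that it approaches the KL divergence as α → 1; the distributions `p` and `q` are hypothetical examples.

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence of order alpha between discrete distributions p and q.

    As alpha -> 1 this recovers the Kullback-Leibler divergence,
    which is handled as a special case to avoid division by zero.
    """
    if alpha == 1.0:
        # KL divergence: sum p_i * log(p_i / q_i), with 0 log 0 = 0
        return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (alpha - 1)

# Hypothetical example distributions
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

kl = renyi_divergence(p, q, 1.0)
near_one = renyi_divergence(p, q, 1.001)  # close to the KL value
```

Note that D_α(P‖P) = 0 for any order α, since Σᵢ pᵢ^α pᵢ^(1−α) = 1.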