“…For more information about other applications of Shannon entropy, see Asadi et al., Cover and Thomas, and Ebrahimi et al., among others. Dynamic and bivariate versions can be seen in Asadi et al., Chamany and Baratpour, Jomhoori and Yousefzadeh, Navarro et al., and references therein. Another useful measure of the distance between two density functions $f$ and $g$ is the Kullback–Leibler (KL) distance, defined by
$$ KL(f,g) = \int f(x)\,\log\frac{f(x)}{g(x)}\,dx = \int f(x)\,\log f(x)\,dx - \int f(x)\,\log g(x)\,dx, $$
where $\int f(x)\,\log g(x)\,dx$ is known as “Fraser information” (Kent) and $-\int f(x)\,\log g(x)\,dx$ is also known as “inaccuracy measure” (Kerridge).…”
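As a concrete illustration of this definition (an added example, not part of the quoted source), take two exponential densities $f(x)=\lambda e^{-\lambda x}$ and $g(x)=\mu e^{-\mu x}$ on $(0,\infty)$. Then
$$ KL(f,g) = \int_0^\infty \lambda e^{-\lambda x}\left[\log\frac{\lambda}{\mu} - (\lambda-\mu)x\right]dx = \log\frac{\lambda}{\mu} + \frac{\mu}{\lambda} - 1, $$
which is nonnegative and equals zero only when $\lambda=\mu$, i.e., when the two densities coincide.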