Information entropy and its extensions, which are important generalizations of entropy, are currently applied in many research domains. In this paper, a novel generalized relative entropy is constructed to avoid some defects of traditional relative entropy. After discussing these defects, we present the structure of the generalized relative entropy. Moreover, several properties of the proposed generalized relative entropy are stated and proved; in particular, it is shown to have a finite range and to be a finite distance metric. Finally, we predict nucleosome positioning in fly and yeast based on the generalized relative entropy and on relative entropy, respectively. The experimental results show that the properties of the generalized relative entropy are better than those of relative entropy.

Keywords: relative entropy; generalized relative entropy; upper bound; distance metric; adjusted distance
Background

The concept of entropy was proposed by R. Clausius as one of the parameters reflecting the degree of chaos of an object. Later, researchers found that information is such an abstract concept that it is hard to quantify its amount; indeed, it was not until information entropy was proposed by Shannon that a standard measure for the amount of information became available. Subsequently, several related concepts based on information entropy were proposed, such as cross entropy, relative entropy, and mutual information, which offer effective methods for solving complex problems in information processing. Therefore, the study of a novel metric based on information entropy is significant for the research domain of information science.

Information entropy was first proposed by Shannon. Assuming an information source $I$ is composed of $n$ different signals $I_i$, the information entropy $H(I)$ of $I$ is given in Equation (1):

$$H(I) = E\left(-\log_k p_i\right) = -\sum_{i=1}^{n} p_i \log_k p_i, \qquad (1)$$

where $p_i = \frac{\text{amount of } I_i \text{ signals}}{\text{amount of } I}$ denotes the frequency of $I_i$, $E(\cdot)$ denotes mathematical expectation, and $k > 1$ is the base of the logarithm. When $k = 2$, the unit of $H(I)$ is the bit.

Information entropy is a metric of the chaos degree of an information source: the larger the information entropy, the more chaotic the information source, and vice versa. Afterwards, cross entropy was proposed based on information entropy; its definition is shown in Equation (2), where P