Information entropy and its extensions are important generalizations of entropy and are currently applied in many research domains. In this paper, a novel generalized relative entropy is constructed to avoid some defects of traditional relative entropy. After discussing the defects of relative entropy, we present the structure of the generalized relative entropy. Moreover, some properties of the proposed generalized relative entropy are presented and proved: it is shown to have a finite range and to be a finite distance metric. Finally, we predict nucleosome positioning in fly and yeast based on generalized relative entropy and relative entropy, respectively. The experimental results show that the generalized relative entropy performs better than relative entropy.

Keywords: relative entropy; generalized relative entropy; upper bound; distance metric; adjusted distance
Background

The concept of entropy was proposed by R. Clausius as a parameter reflecting the degree of chaos of an object. Later research found that information was so abstract a concept that it was hard to quantify its amount. Indeed, it was not until information entropy was proposed by Shannon that we had a standard measure for the amount of information. Subsequently, related concepts based on information entropy were proposed, such as cross entropy, relative entropy, and mutual information, which offered effective methods for solving complex problems of information processing. Therefore, the study of a novel metric based on information entropy is significant in the research domain of information science.

Information entropy was first proposed by Shannon. Assume an information source I is composed of n different signals I_i. The information entropy H(I) of I is shown in Equation (1):

H(I) = E(-\log_k p_i) = -\sum_{i=1}^{n} p_i \log_k p_i,  (1)

where p_i = (number of I_i signals) / (total number of signals in I) denotes the frequency of I_i, E(\cdot) denotes mathematical expectation, and k > 1 is the base of the logarithm. When k = 2, the unit of H(I) is the bit.

Information entropy is a metric of the chaos degree of an information source: the larger the information entropy, the more chaotic the information source, and vice versa. Afterwards, cross entropy was proposed based on information entropy; its definition is shown in Equation (2):

H(P, Q) = -\sum_{i=1}^{n} p_i \log_k q_i,  (2)

where P and Q are two probability distributions over the same source, with p_i and q_i their respective probabilities for signal I_i.
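The quantities above can be illustrated with a short computation. The sketch below (function names are ours, not from the paper) evaluates Equations (1) and (2) directly, and also forms the relative entropy D(P || Q) = H(P, Q) - H(P), the quantity this paper generalizes:

```python
import math

def shannon_entropy(p, k=2):
    """Equation (1): H(I) = -sum_i p_i * log_k(p_i); k=2 gives bits."""
    return -sum(pi * math.log(pi, k) for pi in p if pi > 0)

def cross_entropy(p, q, k=2):
    """Equation (2): H(P, Q) = -sum_i p_i * log_k(q_i)."""
    return -sum(pi * math.log(qi, k) for pi, qi in zip(p, q) if pi > 0)

def relative_entropy(p, q, k=2):
    """D(P || Q) = H(P, Q) - H(P) = sum_i p_i * log_k(p_i / q_i)."""
    return cross_entropy(p, q, k) - shannon_entropy(p, k)

# A uniform source over 4 signals is maximally chaotic: 2 bits of entropy.
p = [0.25, 0.25, 0.25, 0.25]
q = [0.5, 0.25, 0.125, 0.125]
print(shannon_entropy(p))      # → 2.0
print(relative_entropy(p, q))  # → 0.25
```

The uniform distribution attains the maximum entropy log_k(n), consistent with the chaos interpretation above; the nonzero D(P || Q) measures how far Q deviates from P.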