2022
DOI: 10.3390/e24101417

Rényi Cross-Entropy Measures for Common Distributions and Processes with Memory

Abstract: Two Rényi-type generalizations of the Shannon cross-entropy, the Rényi cross-entropy and the Natural Rényi cross-entropy, were recently used as loss functions for the improved design of deep learning generative adversarial networks. In this work, we derive the Rényi and Natural Rényi differential cross-entropy measures in closed form for a wide class of common continuous distributions belonging to the exponential family, and we tabulate the results for ease of reference. We also summarise the Rényi-type cross-…
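As a brief illustration of the quantity the abstract refers to (a sketch, not taken from the paper itself): one definition of the Rényi cross-entropy of order α that appears in the literature, for the discrete case, is H_α(p; q) = (1/(1−α)) · log Σᵢ pᵢ qᵢ^(α−1), which recovers the Shannon cross-entropy −Σᵢ pᵢ log qᵢ as α → 1. The function name and example distributions below are illustrative.

```python
import math

def renyi_cross_entropy(p, q, alpha):
    """Rényi cross-entropy of order alpha (in nats), discrete case.

    Uses one definition found in the literature:
        H_a(p; q) = (1 / (1 - a)) * log( sum_i p_i * q_i**(a - 1) ).
    At alpha == 1 it falls back to the Shannon cross-entropy,
    which is the a -> 1 limit of the expression above.
    """
    if abs(alpha - 1.0) < 1e-12:
        return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi * qi ** (alpha - 1) for pi, qi in zip(p, q) if pi > 0)
    return math.log(s) / (1 - alpha)

# Illustrative distributions on a 3-symbol alphabet.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

shannon = -sum(pi * math.log(qi) for pi, qi in zip(p, q))
near_one = renyi_cross_entropy(p, q, 1.0 + 1e-6)
assert abs(near_one - shannon) < 1e-4  # continuous at alpha = 1
```

The assertion checks numerically that the order-α measure approaches the Shannon cross-entropy as α → 1, the basic consistency property a Rényi-type generalization is expected to satisfy.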

Cited by 3 publications (1 citation statement)
References 19 publications
“…Following the same path, we would like to provide a measure in the case of an exponential average. While Rényi himself defined a generalized D_KL [26], further analyzed in [27] and [28], and different definitions of a generalized cross-entropy exist [29], we would like to define such quantities in the framework of data compression. In particular, the exponential average codeword length when r is used to perform the compression is given by:…”
Section: A Note on the Estimation of the Source Probability Distribution
confidence: 99%
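The exponential average codeword length mentioned in the excerpt (the quoted formula itself is truncated in this report) is, in Campbell's classical formulation, L(t) = (1/t) · log₂ Σᵢ pᵢ 2^(t·ℓᵢ) for order t > 0, reducing to the ordinary expected length Σᵢ pᵢ ℓᵢ as t → 0. The sketch below assumes ideal codeword lengths ℓᵢ = −log₂ rᵢ for a mismatched coding distribution r, matching the excerpt's setting where r is used to perform the compression; the names and distributions are illustrative.

```python
import math

def exp_avg_length(p, lengths, t):
    """Campbell's exponential average codeword length of order t > 0:
        L(t) = (1 / t) * log2( sum_i p_i * 2**(t * l_i) ).
    As t -> 0 this tends to the ordinary expected length sum_i p_i * l_i.
    """
    return math.log2(sum(pi * 2 ** (t * li) for pi, li in zip(p, lengths))) / t

# Source distribution p, mismatched coding distribution r (illustrative).
p = [0.5, 0.25, 0.25]
r = [0.25, 0.5, 0.25]
lengths = [-math.log2(ri) for ri in r]  # ideal lengths for a code built from r

ordinary = sum(pi * li for pi, li in zip(p, lengths))
small_t = exp_avg_length(p, lengths, 1e-6)
assert abs(small_t - ordinary) < 1e-4  # t -> 0 recovers the arithmetic mean
```

Because x ↦ 2^(t·x) is convex, L(t) never falls below the ordinary expected length, which is why the exponential average penalizes long codewords more heavily than the arithmetic mean does.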