2020
DOI: 10.1109/tit.2019.2937318
On the Conditional Smooth Rényi Entropy and its Applications in Guessing and Source Coding

Abstract: A novel definition of the conditional smooth Rényi entropy, different from that of Renner and Wolf, is introduced. It is shown that this definition is appropriate for deriving lower and upper bounds on the optimal guessing moment in a guessing problem where the guesser is allowed to stop guessing and declare an error. Further, a general formula for the optimal guessing exponent is given. In particular, a single-letterized formula for mixtures of i.i.d. sources is obtain…
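The guessing-moment bounds that this line of work builds on can be sketched numerically. The sketch below uses the classical setting of Arikan (guessing until success, without the error-declaration option studied in the paper): the optimal strategy guesses outcomes in decreasing order of probability, and the ρ-th guessing moment is bracketed by the Rényi entropy of order 1/(1+ρ). The pmf and ρ chosen here are illustrative, not taken from the paper.

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (in nats) of a pmf p, for alpha != 1."""
    return math.log(sum(q ** alpha for q in p)) / (1.0 - alpha)

def optimal_guessing_moment(p, rho):
    """E[G(X)^rho] under the optimal strategy: guess outcomes in
    decreasing order of probability (guess indices start at 1)."""
    ordered = sorted(p, reverse=True)
    return sum((k + 1) ** rho * q for k, q in enumerate(ordered))

# Arikan's bounds for rho > 0 and alphabet size M:
#   (1 + ln M)^(-rho) * e^{rho * H_a(X)} <= E[G^rho] <= e^{rho * H_a(X)},
# where a = 1/(1+rho) and H_a is the Rényi entropy of order a.
p = [0.5, 0.25, 0.125, 0.125]   # illustrative pmf, not from the paper
rho = 1.0
alpha = 1.0 / (1.0 + rho)

upper = math.exp(rho * renyi_entropy(p, alpha))
lower = upper / (1.0 + math.log(len(p))) ** rho
moment = optimal_guessing_moment(p, rho)

assert lower <= moment <= upper
```

For this pmf the optimal first moment is E[G] = 1(0.5) + 2(0.25) + 3(0.125) + 4(0.125) = 1.875, which indeed falls between the two Rényi-entropy bounds. The paper's contribution is a *smooth* conditional version of this entropy, tailored to the variant where the guesser may give up and declare an error.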

Cited by 16 publications (49 citation statements) | References 28 publications
“…The motivation for this work is rooted in the diverse information-theoretic applications of Rényi measures [62]. These include (but are not limited to) asymptotically tight bounds on guessing moments [1], information-theoretic applications such as guessing subject to distortion [2], joint source-channel coding and guessing with application to sequential decoding [3], guessing with prior access to a malicious oracle [14], guessing while allowing the guesser to give up and declare an error [50], guessing in secrecy problems [56], [75], guessing with limited memory [64], and guessing under source uncertainty [74]; encoding tasks [12], [13]; Bayesian hypothesis testing [8], [67], [79], and composite hypothesis testing [71], [77]; Rényi generalizations of the rejection sampling problem in [35], motivated by the communication complexity in distributed channel simulation, where these generalizations distinguish between causal and non-causal sampler scenarios [52]; Wyner's common information in distributed source simulation under Rényi divergence measures [87]; various other source coding theorems [15], [23], [24], [36], [49], [50], [68], [76], [78], [79], channel coding theorems [4], [5], [26], [60], [66], [78], [79], [86], including coding theorems in quantum information theory…”
Section: Introduction
confidence: 99%
“…For future work we are very interested in further results based on other (additive) entropies, such as Rényi entropies, where other guessing bounds have already been investigated [19] beyond their original use in moment inequalities [25]–[27], and other derived problems such as guessing with limited (or no) memory [28].…”
Section: Discussion
confidence: 99%
“…Remark 3. The quantity E[ log ( ) ] that appears on the left-hand side of (52) is closely related to the fundamental limits of the guessing problem [37], [38] allowing errors [39] for a source ; see [34, Section IV].…”
Section: B. On the Cutoff Operation for the Logarithm of an Integer-Valued Random Variable
confidence: 99%