2015
DOI: 10.1007/978-3-319-27239-9_16
Shannon Entropy Versus Renyi Entropy from a Cryptographic Viewpoint

Cited by 13 publications (9 citation statements)
References 21 publications
“…The relation between the results of evaluating a cryptosystem using Shannon entropy and collision entropy was studied by Skorski [ 18 ], and the worst possible collision entropy for random variables with a given Shannon entropy was calculated.…”
Section: Entropy Measures and Related Concepts (mentioning, confidence: 99%)
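The relation described in this statement can be sketched numerically. The snippet below (a minimal sketch; the example distribution is arbitrary, not taken from the cited work) computes Shannon entropy and collision (Rényi-2) entropy of the same distribution and checks that collision entropy never exceeds Shannon entropy:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H1(X) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def collision_entropy(p):
    """Collision (Renyi-2) entropy H2(X) = -log2(sum_i p_i^2), in bits."""
    return -math.log2(sum(pi * pi for pi in p))

# An arbitrary biased distribution: H2 lower-bounds how "collision-free"
# the source is, and is always at most the Shannon entropy H1.
p = [0.5, 0.25, 0.125, 0.125]
assert collision_entropy(p) <= shannon_entropy(p)
```

For cryptographic purposes this gap matters: a source can have high Shannon entropy yet much lower collision entropy, which is the worst-case phenomenon the cited paper quantifies.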
“…, P_k]^{b−1}, for some elements x there will be many patterns that give x = g(a, as, b), while for other elements x there will be few. To formalize this discussion we adapt the approach that Smith proposed in [22] about the Shannon entropy of completely partitioned sets, and the relations between Shannon entropy and several instances of Rényi entropy (such as collision entropy and min-entropy) studied by Cachin in [23] (see also the work of Skórski [24]).…”
Section: Dichotomy Between Odd and Even Bases (mentioning, confidence: 99%)
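The Rényi-entropy instances named in this statement obey a fixed ordering: Rényi entropy is non-increasing in its order α, so min-entropy ≤ collision entropy ≤ Shannon entropy for every distribution. A small sketch (the distribution is an arbitrary illustration, not from the cited works):

```python
import math

def min_entropy(p):
    """Min-entropy H_inf(X) = -log2(max_i p_i), the most conservative measure."""
    return -math.log2(max(p))

def collision_entropy(p):
    """Collision (Renyi-2) entropy H2(X) = -log2(sum_i p_i^2)."""
    return -math.log2(sum(pi * pi for pi in p))

def shannon_entropy(p):
    """Shannon entropy H1(X) = -sum_i p_i * log2(p_i)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# Renyi entropy decreases as the order alpha grows, so for any p:
# H_inf <= H2 <= H1, with equality exactly when p is uniform.
p = [0.4, 0.3, 0.2, 0.1]
assert min_entropy(p) <= collision_entropy(p) <= shannon_entropy(p)
```

This ordering is why min-entropy is the standard conservative choice in cryptography: any guarantee stated in terms of H_inf also holds under the larger measures.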
“…Observe that A cannot be considered a stochastic, or conditional probability, matrix; that is, in general ∑_j a_ij = ∑_j t_ij/y_j ≠ 1. However, the transposed matrix Aᵀ is a stochastic matrix and, by construction, it defines a Markov chain with stationary distribution y, that is, y = yAᵀ, Equation (13).…”
Section: SAM Coefficient Matrix as a Markov Chain (mentioning, confidence: 99%)
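The construction in this statement can be sketched with a hypothetical symmetric count matrix T standing in for the SAM coefficient data (the values below are invented for illustration): column-normalizing T by y gives a matrix A whose rows need not sum to 1, while its transpose Aᵀ is row-stochastic and has y as a stationary distribution.

```python
# Hypothetical symmetric matrix T; symmetry is what makes y stationary below.
T = [[3.0, 1.0],
     [1.0, 5.0]]
n = len(T)
y = [sum(T[i][j] for i in range(n)) for j in range(n)]  # column sums y_j
A = [[T[i][j] / y[j] for j in range(n)] for i in range(n)]  # a_ij = t_ij / y_j

# A itself is not row-stochastic, but A^T is (its rows are A's columns).
row_sums_A = [sum(A[i][j] for j in range(n)) for i in range(n)]
row_sums_At = [sum(A[i][j] for i in range(n)) for j in range(n)]
assert any(abs(s - 1.0) > 1e-9 for s in row_sums_A)
assert all(abs(s - 1.0) < 1e-9 for s in row_sums_At)

# y = y A^T: component j of y A^T is sum_i y_i * (A^T)_ij = sum_i y_i * A[j][i].
yAt = [sum(y[i] * A[j][i] for i in range(n)) for j in range(n)]
assert all(abs(yAt[j] - y[j]) < 1e-9 for j in range(n))
```

The key step is the symmetry of T: then (yAᵀ)_j = ∑_i y_i t_ij / y_i = ∑_i t_ij = y_j, so y is fixed by the chain without any eigenvector computation.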
“…The minimum value of H is 0, attained when for some x, p(x) = 1 and all other probabilities are thus 0. Thus, entropy can also be considered a measure of homogeneity or uniformity of a distribution [13] or a diversity index [14]: the higher its value, the more homogeneous the distribution, and vice versa.…”
Section: Information Measures and Information Channel (mentioning, confidence: 99%)