2019
DOI: 10.1145/3356867

An Operational Characterization of Mutual Information in Algorithmic Information Theory

Abstract: We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having x and the complexity profile of the pair and the other one having y and the complexity profile of the pair, can establish via a probabilistic protocol with interaction on a public channel. For ℓ > 2, the longest shared secret that can be established from a tuple of strings (x_1, ..., x_ℓ) by …
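
Stated as a formula, this is (a schematic rendering of the abstract's claim; the notation S(x, y) for the optimal shared-key length and the exact O(log n) error term are assumptions based on standard conventions in this literature, not taken verbatim from the paper):

\[
S(x, y) \;=\; I(x : y) \pm O(\log n), \qquad I(x : y) \;=\; C(x) + C(y) - C(x, y),
\]

where C denotes Kolmogorov complexity and n = |x| + |y|.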

Cited by 11 publications (21 citation statements: 0 supporting, 21 mentioning, 0 contrasting). References 35 publications.
“…In case we use Kolmogorov complexity to measure the amount of information, I(x : y) is defined as C(x)+C(y)−C(x, y), and, up to logarithmic precision, is also equal to C(x)−C(x | y) and to C(y)−C(y | x). It is shown in [RZ18] (extending a classical result from [AC93], which is valid for inputs generated by memoryless processes and which uses Shannon entropy to measure information) that no computable protocol (even a probabilistic one) can obtain a shared secret key longer than the mutual information of the inputs x and y. On the other hand, a protocol is presented in [RZ18] that with high probability produces a shared secret of length I(x : y) (up to logarithmic precision), provided, as mentioned above, the two parties know the complexity profile of the inputs.…”
Section: Introduction
confidence: 99%
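
Since C is uncomputable, these identities can only be explored empirically by substituting a real compressor for C, in the spirit of compression-based information distance. A minimal Python sketch under that assumption (zlib's compressed length as a crude stand-in for C; the helper names are ours):

import zlib

def c(s: bytes) -> int:
    # Crude proxy for Kolmogorov complexity C(s): length of the
    # zlib-compressed string at maximum compression level.
    return len(zlib.compress(s, 9))

def mutual_information(x: bytes, y: bytes) -> int:
    # Estimate I(x : y) = C(x) + C(y) - C(x, y).
    # C(x, y) is approximated by compressing the concatenation;
    # a proper encoding of the pair would add only O(log) extra bits.
    return c(x) + c(y) - c(x + y)

shared = b"the quick brown fox jumps over the lazy dog " * 20
x = shared + b"alice-only suffix " * 10
y = shared + b"bob-only suffix " * 10
print("C(x)   ~", c(x))
print("C(y)   ~", c(y))
print("C(x,y) ~", c(x + y))
print("I(x:y) ~", mutual_information(x, y))

With a large shared part, the estimate of I(x : y) comes out positive and roughly tracks the compressible overlap; the equality with C(x) − C(x | y) holds only up to the logarithmic slack the quotation mentions, and the compressor adds its own error on top.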
“…It is shown in [RZ18] (extending a classical result from [AC93], which is valid for inputs generated by memoryless processes and which uses Shannon entropy to measure information) that no computable protocol (even a probabilistic one) can obtain a shared secret key longer than the mutual information of the inputs x and y. On the other hand, a protocol is presented in [RZ18] that with high probability produces a shared secret of length I(x : y) (up to logarithmic precision), provided, as mentioned above, the two parties know the complexity profile of the inputs. Thus, the above discussion suggests that it is natural to aim for a shared secret key whose length is equal to the mutual information of the inputs, for some concept of information that measures the detectable correlation.…”
Section: Introduction
confidence: 99%