Conditional complexity and codes (2002)
DOI: 10.1016/s0304-3975(01)00033-0

Cited by 57 publications (38 citation statements). References 3 publications.
“…These two definitions coincide for since up to an additive constant term. We investigate the maximal overlap of information (Theorem 3.1) which for specializes to Theorem 3.4 in [3], Corollary 3.2 shows (I.1) and Corollary 3.3 shows that the LHS of (I.2) can be taken to correspond to a single program embodying the "most comprehensive object that contains the most information about all the others" as stated but not argued or proved in [27]; metricity (Theorem 4.1) and universality (Theorem 5.2) which for (for metricity) and (for universality) specialize to Theorem 4.2 in [3]; additivity (Theorem 6.1); minimum overlap of information (Theorem 7.1) which for specializes to [29,Theorem 8.3.7] and the nonmetricity of normalized information distance for lists of more than two elements and certain proposals of the normalizing factor (Section VIII). In contrast, for lists of two elements we can normalize the information distance as in Lemma V.4 and Theorem V.7 of [26].…”
Section: B. Results (mentioning; confidence: 99%)
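The normalized information distance discussed in this excerpt is defined via Kolmogorov complexity and is therefore uncomputable. In practice it is commonly approximated by the normalized compression distance, replacing K with the length of a compressed string. A minimal sketch of that approximation (the helper names `approx_K` and `ncd` are illustrative, not from the paper):

```python
import zlib

def approx_K(s: bytes) -> int:
    # Compressed length as a computable stand-in for the
    # (uncomputable) Kolmogorov complexity K(s).
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    # Normalized compression distance: the practical analogue of the
    # normalized information distance for a pair of strings.
    cx, cy, cxy = approx_K(x), approx_K(y), approx_K(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

x = b"mathematics " * 50   # highly repetitive string
z = bytes(range(256)) * 3  # unrelated byte pattern
print(ncd(x, x))  # near 0: a string shares all information with itself
print(ncd(x, z))  # near 1: the strings share almost no information
```

Note that, as the excerpt's final remark indicates, this pairwise normalization does not extend to a metric on lists of more than two elements.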
“…Now let x and y be arbitrary strings of length at most n. Muchnik [29], see also the textbook [28,Theorem 8.3.7], shows that there exists a shortest program p that converts y to x (that is, U(p, y) = x and l(p) = K(x|y)), such that p is simple with respect to x and therefore depends little on the origin y, that is, K(p|x) = O(log n). This is a fundamental coding property for individual strings that parallels related results about random variables known as the Slepian-Wolf and Csiszár-Körner-Marton theorems [11].…”
Section: Theorem 5.2 (mentioning; confidence: 99%)
“…Inequalities (1) and (3) could be rephrased in the Kolmogorov complexity setting, but the natural counterparts of these inequalities prove not to be valid for Kolmogorov complexity. The proof of this fact is very similar to the argument in Theorem 2 (we need to use Muchnik's theorem on conditional descriptions [14] instead of the Slepian-Wolf theorem employed in Shannon's framework). We omit the details for lack of space.…”
Section: Conclusion and Discussion (mentioning; confidence: 86%)
“…The transmission request graph has the corresponding edge deleted (Fig. 3): Muchnik noted [9] that the condition K(A|B) ≤ k … Here is the exact statement of Muchnik's theorem: Let A and B be arbitrary strings of complexity at most n. Then there exists a string X of length…”
Section: A Nontrivial Example: Muchnik's Theorem (mentioning; confidence: 99%)
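The excerpt above truncates the statement of the theorem. As a hedged sketch (the standard textbook form, up to O(log n) terms, with illustrative variable names a, b, p rather than the excerpt's A, B, X), Muchnik's theorem reads:

```latex
% Muchnik's theorem (standard form; a sketch, not a quotation of the
% truncated excerpt above). For all strings $a, b$ of complexity at
% most $n$, there exists a string $p$ such that
\begin{align*}
  &U(p, b) = a,                      && \text{$p$ is a program mapping $b$ to $a$,}\\
  &l(p) \le K(a \mid b) + O(\log n), && \text{$p$ is nearly shortest,}\\
  &K(p \mid a) = O(\log n),          && \text{$p$ is simple given the target $a$.}
\end{align*}
```

The last condition is the point emphasized in the Theorem 5.2 excerpt above: the near-shortest program for obtaining a from b can be chosen to depend (up to logarithmic precision) only on the target a, not on the origin b.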