1996
DOI: 10.1109/18.532875

Source coding and graph entropies

Abstract: A sender wants to accurately convey information to a receiver who has some, possibly related, data. We study the expected number of bits the sender must transmit for one and for multiple instances in two communication scenarios and relate this number to the chromatic and Körner entropies of a naturally defined graph.

Cited by 141 publications (190 citation statements)
References 16 publications
“…Clearly, Corollary 1 and the aforementioned examples show that in the interior of the simplex of distributions on with a discontinuity at its boundary. While discontinuities of this type are well known to arise in problems such as zero-error channel coding [30] and the zero-error Slepian-Wolf problem [40], [1], [19], it is interesting to see it arise in our setting, which assumes the standard "near-lossless" formulation. This discontinuity was observed also in [18], whose " -chromatic" number is also defined through a near-lossless source coding problem.…”
Section: Remarks
confidence: 99%
“…Witsenhausen showed that fixed-length side information codes were equivalent to colorings of a related object called the confusion graph, and thus the logarithm of the chromatic number of this graph tightly characterizes the minimum number of bits needed to encode the source. Further results by Alon and Orlitsky [2] and Koulgi et al [12] showed that graph-theoretic information measures could be used to characterize both the average length of variable-length codes, as well as asymptotic rates of codes that simultaneously encode multiple inputs drawn from the same source.…”
Section: Introduction
confidence: 99%
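The coloring equivalence quoted above can be made concrete with a small sketch: build the confusion graph from the support of a joint distribution (x and x′ are confusable when some side-information value y co-occurs with both), color it, and encode each source symbol by its color. The toy support, the greedy coloring, and all names below are illustrative assumptions, not from the paper; greedy coloring only upper-bounds the chromatic number.

```python
from itertools import combinations
from math import ceil, log2

# Hypothetical support of a joint distribution P(x, y).
support = {("a", "1"), ("a", "3"), ("b", "1"), ("b", "2"), ("c", "2"), ("c", "3")}

xs = sorted({x for x, _ in support})

def confusable(x1, x2):
    # x1 and x2 are confusable if some y co-occurs with both,
    # so a decoder knowing y cannot tell them apart.
    ys1 = {y for x, y in support if x == x1}
    ys2 = {y for x, y in support if x == x2}
    return bool(ys1 & ys2)

edges = {(x1, x2) for x1, x2 in combinations(xs, 2) if confusable(x1, x2)}

# Greedy coloring: an upper bound on the chromatic number of the confusion graph.
color = {}
for x in xs:
    used = {color[u] for u in color if (min(u, x), max(u, x)) in edges}
    color[x] = min(c for c in range(len(xs)) if c not in used)

num_colors = len(set(color.values()))
bits = ceil(log2(num_colors))  # fixed-length code: one codeword per color
```

Here a, b, c are pairwise confusable (they pairwise share a y), so the confusion graph is a triangle: three colors, hence a 2-bit fixed-length code, even though log2 of the alphabet size would also be 2; on sparser supports the coloring saves bits.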
“…Upper and lower bounds on L_Y(X) in terms of the entropy of the optimal coloring are given in [2]. Finding a single-…”
Section: Preliminaries
confidence: 99%
“…We will focus only on (x, y) pairs with P (x, y) > 0 and thus restrict attention only to restricted inputs (RI) protocols, as defined in [2]. A protocol for transmitting X when the decoder knows Y , henceforth referred to as an RI protocol, is defined to be a mapping φ : X → {0, 1}* such that if x and x′ are confusable then φ(x) is neither equal to, nor a prefix of, φ(x′).…”
Section: Preliminaries
confidence: 99%
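The RI-protocol condition quoted above is easy to check mechanically: only confusable pairs need prefix-distinct codewords, so non-confusable symbols may even share a codeword. A minimal sketch, where the confusion relation, the encoding φ, and the helper names are all hypothetical:

```python
# Hypothetical confusion relation: a~b and b~c, but a and c are NOT confusable.
confusable_pairs = {("a", "b"), ("b", "c")}

# Candidate encoding. a and c share the codeword "0", which is allowed
# because the decoder's side information already separates them.
phi = {"a": "0", "b": "1", "c": "0"}

def prefix_of(s, t):
    # True if s equals t or is a proper prefix of t.
    return t.startswith(s)

def valid_ri_protocol(phi, pairs):
    # RI condition: for every confusable pair, neither codeword
    # equals or prefixes the other.
    return all(not prefix_of(phi[x], phi[z]) and not prefix_of(phi[z], phi[x])
               for x, z in pairs)

ok = valid_ri_protocol(phi, confusable_pairs)
```

Reusing "0" for the non-confusable pair (a, c) is exactly what lets RI protocols beat ordinary prefix codes on average length.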