2011 49th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
DOI: 10.1109/allerton.2011.6120166
The dispersion of joint source-channel coding

Abstract: In this work we investigate the behavior of the distortion threshold that can be guaranteed in joint source-channel coding, to within a prescribed excess-distortion probability. We show that the gap between this threshold and the optimal average distortion is governed by a constant that we call the joint source-channel dispersion. This constant can be easily computed, since it is the sum of the source and channel dispersions, previously derived. The resulting performance is shown to be better than that of any s…
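The abstract's claim that the joint dispersion is the sum of the source and channel dispersions can be illustrated with a small numeric sketch. The formula below follows the standard second-order (Gaussian) approximation, with the backoff from the first-order limit scaling as the square root of the summed dispersions times Q⁻¹(ε); all symbol names and parameter values are illustrative assumptions, not taken from the paper.

```python
from statistics import NormalDist

def jscc_backoff(n, k, V_C, V_S, eps):
    """Second-order approximation of the backoff from the first-order
    JSCC limit n*C - k*R(D), using the sum of the channel dispersion
    V_C (per channel use) and source dispersion V_S (per source symbol).
    """
    q_inv = NormalDist().inv_cdf(1.0 - eps)  # Q^{-1}(eps)
    return (n * V_C + k * V_S) ** 0.5 * q_inv

# Illustrative numbers: n = 1000 channel uses, k = 800 source symbols,
# dispersions V_C = 0.2 and V_S = 0.1, excess-distortion probability 1e-3.
gap = jscc_backoff(n=1000, k=800, V_C=0.2, V_S=0.1, eps=1e-3)
# The guaranteed k*R(D) is then roughly n*C minus this gap.
```

The point of the sketch is only that the two dispersions enter additively (weighted by the respective blocklengths) under one Q⁻¹(ε) factor, rather than contributing two separate backoff terms.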


Cited by 46 publications (76 citation statements)
References 9 publications
“…and the derivative is bounded; We note that similar regularity assumptions were made in other works on second-order asymptotics for lossy source coding [27] and lossy joint source-channel coding [42].…”
Section: B General Discrete Memoryless Sources (mentioning; confidence: 53%)
“…Non-asymptotic achievability and converse bounds for a graph-theoretic model of JSCC have been obtained by Csiszár [16]. Most recently, Tauste Campo et al [17] showed a number of finite-blocklength random-coding bounds applicable to the almost-lossless JSCC setup, while Wang et al [18] found the dispersion of JSCC for sources and channels with finite alphabets.…”
Section: Introduction (mentioning; confidence: 99%)
“…The direct part of the proof of the theorem in the original Wyner-Ziv paper [4] is based on the average fidelity criterion in (24). It relies on the compress-bin idea.…”
Section: B First-Order Results for the WZ Problem (mentioning; confidence: 99%)
“…The pre-factor of this term, S(V, ε), is likened to the dispersion [22], [24]–[26], and depends not only on the variances of the information and entropy densities but also on their correlations.…”
Section: A Main Contributions (mentioning; confidence: 99%)