2006
DOI: 10.1109/tit.2006.872974
Variable length coding over an unknown channel

Abstract: Burnashev in 1976 gave an exact expression for the reliability function of a discrete memoryless channel (DMC) with noiseless feedback. A coding scheme that achieves this exponent needs, in general, to know the statistics of the channel. Suppose now that the coding scheme is designed knowing only that the channel belongs to a family of DMCs. Is there a coding scheme with noiseless feedback that achieves Burnashev's exponent uniformly over the family at a nontrivial rate? We answer the question in the affirmative…
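For context (this formula is standard background, not part of the original abstract): Burnashev's reliability function for variable-length coding over a DMC with noiseless feedback takes a simple straight-line form, which can be written as

```latex
% Burnashev's reliability function for a DMC with noiseless feedback
% (variable-length coding), valid for rates 0 <= R <= C:
E(R) = C_1 \left( 1 - \frac{R}{C} \right),
\qquad
C_1 = \max_{x, x'} D\!\left( P_{Y|X}(\cdot \mid x) \,\middle\|\, P_{Y|X}(\cdot \mid x') \right),
```

where C is the Shannon capacity of the channel and C_1 is the largest Kullback–Leibler divergence between the output distributions induced by two input letters. The question posed in the abstract is whether this exponent can be attained without knowing the channel statistics that determine C and C_1.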

Cited by 80 publications (122 citation statements)
References 11 publications
“…A different definition for the compound channel with feedback could also be considered; for instance, in [16], the authors consider codes of variable blocklength and define achievability accordingly.…”
Section: Problem Statement and Main Results
Mentioning confidence: 99%
“…Proposition 2 tells us that the distortion scales faster than max{M^{…}, M^{…}}. We again condition on the value of S_0, which gives the same success probabilities as (29) and (31). Therefore the distortion with this one bit of feedback is also O(M^{−1/3}).…”
Section: Example: Perfect Feedback
Mentioning confidence: 93%
“…(72)]. Now, we analyze E_b(R, T) given in (14). Note that there is no conceptual difference between E_a(R, T) and E_b(R, T), and it can be verified that the latter can be written as…”
Section: Proof of Corollary
Mentioning confidence: 99%
“…Indeed, as was mentioned before, on the one hand, as α increases, the decoder has better knowledge of the channel, even if it does not estimate it explicitly. On the other hand, the number of available symbols (1 − α)n that are used to distinguish the M = e^{nR} codewords from one another decreases. Thus, we expect that, in general, ξ*(R, T, α, P_X) will be maximized by some α* ∈ (0, 1).…”
Section: Appendix B: Universal Decoder With Training
Mentioning confidence: 99%
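The tradeoff in the excerpt above can be made concrete with a small rate calculation (the symbols α, n, R, M are those of the excerpt; the rearrangement itself is ours, not quoted from the cited paper). Spending a fraction α of the blocklength on training leaves (1 − α)n symbols to distinguish M = e^{nR} codewords, so the decoder faces an inflated effective rate:

```latex
% Effective rate seen by the decoder when \alpha n symbols are used for training:
\frac{\log M}{(1-\alpha)n} = \frac{nR}{(1-\alpha)n} = \frac{R}{1-\alpha} > R
\quad \text{for all } \alpha \in (0,1).
```

Larger α thus buys better channel knowledge at the price of a higher effective rate, which is why the maximizing α* is expected to lie strictly inside (0, 1).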