On the coding theorem and its converse for finite-memory channels (1959)
DOI: 10.1016/s0019-9958(59)90066-x

Cited by 42 publications (28 citation statements)
References 3 publications
“…Proof of C_Shannon ≤ C_S. The proof is similar to the one in [10], so we just outline the main steps.…”
Section: Results
Confidence: 96%
“…For this, we observe that the asymptotic mean stationarity [13] of the output process makes it possible to apply tools from ergodic theory to establish the existence of the mutual information rate of our channel and further the asymptotic equipartition property of the output process. Another issue is to mix the "blocked" processes to obtain a stationary process achieving the information capacity, for which we find an adaptation of Feinstein's method [10] as a solution.…”
Section: Introduction
Confidence: 99%
“…Subsequently H(Z_n | Z_{n−k}) is given by Eq. (15). Putting everything together, we obtain the lemma.…”
Section: B. The Second Term
Confidence: 86%
“…With an abuse of notation, we therefore drop all subscripts and use p to denote the respective probability measure. Let W_n = Y_n, X The main proof is a modification of Feinstein's work [15], which is set in the ergodic-theoretic framework. We exploit his construction of an SE probability measure from a finite-dimensional probability measure.…”
Section: Appendix B
Confidence: 99%