The method of types [information theory] (1998)
DOI: 10.1109/18.720546

Abstract: The method of types is one of the key technical tools in Shannon Theory, and this tool is valuable also in other fields. In this paper, some key applications will be presented in sufficient detail enabling an interested nonspecialist to gain a working knowledge of the method, and a wide selection of further applications will be surveyed. These range from hypothesis testing and large deviations theory through error exponents for discrete memoryless channels and capacity of arbitrarily varying channels to multiu…

Cited by 328 publications (68 citation statements)
References 68 publications (77 reference statements)
“…The classical (conditional) Kullback-Leibler information (informational divergence or relative entropy) is denoted by D and entropy by H. 10,11,22 Specifically, for a probability distribution Q on X 2 , transition (or conditional) probabilities P (v|u), u, v ∈ X , and a probability distribution p on X , we define…”
Section: Main Results For Simple Case
confidence: 99%
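The statement above cites the paper for the standard definitions of relative entropy D and entropy H as used in the method of types. As a minimal illustration (not from the cited paper itself), the sketch below computes the empirical distribution (the "type") of a sequence and its Kullback-Leibler divergence from a reference distribution; the sequence, alphabet, and function names are illustrative assumptions.

```python
import math
from collections import Counter

def empirical_type(seq, alphabet):
    # The "type" of a sequence: its empirical distribution over the alphabet.
    counts = Counter(seq)
    n = len(seq)
    return {a: counts.get(a, 0) / n for a in alphabet}

def kl_divergence(q, p):
    # D(Q || P) = sum_x Q(x) log(Q(x)/P(x)), with the convention 0 log 0 = 0.
    return sum(q[x] * math.log(q[x] / p[x]) for x in q if q[x] > 0)

alphabet = ["a", "b"]
q = empirical_type("aabab", alphabet)   # type: {a: 0.6, b: 0.4}
p = {"a": 0.5, "b": 0.5}
d = kl_divergence(q, p)                 # exponent governing the type-class probability
```

In the method of types, the probability under P of observing a sequence with type Q decays roughly as exp(-n D(Q||P)), which is why this divergence is the central quantity.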
“…8, which employed the method of types 10,11,22,33. In the present case, second-order (Markov) types rather than the usual types are used.…”
Section: B. The Method Of Types
confidence: 99%
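The statement above distinguishes second-order (Markov) types from ordinary types. A second-order type records the empirical distribution of consecutive symbol pairs rather than single symbols. The sketch below is an assumed minimal implementation of this counting, not code from the cited work.

```python
from collections import Counter

def markov_type(seq):
    # Second-order (Markov) type: the empirical distribution of the
    # consecutive pairs (x_i, x_{i+1}) occurring in the sequence.
    pairs = list(zip(seq, seq[1:]))
    n = len(pairs)
    return {pair: count / n for pair, count in Counter(pairs).items()}

t = markov_type("aabba")
# transitions aa, ab, bb, ba each occur once -> each has frequency 1/4
```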
“…(9) below] to deduce a lower bound on the quantum capacity. This work's approach resembles theirs in that both bounds are shown using random coding arguments [6,7,13] based on (9), but differs from [11] in that while [11] uses an analog of minimum Hamming distance decoding, this work employs an analog of minimum entropy decoding [14] together with the method of types from classical information theory [7,14,15], which enables us to obtain the exponential bound analogous to (1) in a simple enumerative manner.…”
Section: Introduction
confidence: 99%
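The statement above combines the method of types with minimum entropy decoding. In one common classical formulation (sketched here under that assumption; the sequences and helper names are illustrative), the decoder selects the candidate codeword whose joint type with the received word yields the smallest empirical conditional entropy.

```python
import math
from collections import Counter

def empirical_conditional_entropy(x, y):
    # H(X|Y) computed from the joint type of the sequences x and y (in nats).
    n = len(x)
    joint = Counter(zip(x, y))
    marginal_y = Counter(y)
    h = 0.0
    for (a, b), count in joint.items():
        p_xy = count / n                      # joint type entry
        p_x_given_y = count / marginal_y[b]   # conditional type entry
        h -= p_xy * math.log(p_x_given_y)
    return h

def min_entropy_decode(received, codewords):
    # Decode to the codeword minimizing the empirical conditional
    # entropy of the codeword given the received word.
    return min(codewords, key=lambda c: empirical_conditional_entropy(c, received))

decoded = min_entropy_decode("0110", ["0011", "0110"])
# the codeword identical to the received word has conditional entropy 0
```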