1997
DOI: 10.1201/9781420049763.ch73
Information Theory

Cited by 51 publications (78 citation statements) · References 19 publications
“…the average density of states (DOS) of U and the average distribution of a set of complex numbers {b_z} that determine the asymptotics of the eigenstates, when u(t) is drawn from a zero-mean, δ-correlated Gaussian distribution, describing the distribution of transmitted codewords. Gaussian input signals are often used in information theory, and in linear transmission problems they often reach the Shannon capacity [7]. In addition, when the characteristic signal amplitude u₀ is much smaller than its bandwidth τ⁻¹ (but with D ≡ u₀²τ arbitrary), it is reasonable to approximate [8] the input distribution with a δ-correlated Gaussian for eigenvalues z small on the scale of τ⁻¹.…”
Section: Introduction
confidence: 99%
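The excerpt above notes that Gaussian input signals achieve the Shannon capacity of linear transmission channels. As a minimal illustration, assuming the standard real-valued AWGN channel model (the function name and interface here are illustrative, not from the cited work), the capacity per symbol is C = ½·log₂(1 + SNR), attained by a Gaussian input distribution:

```python
import math

def awgn_capacity_bits_per_symbol(snr_linear: float) -> float:
    """Shannon capacity of a real-valued AWGN channel, in bits per symbol:
    C = 0.5 * log2(1 + SNR), where SNR is the signal-to-noise power ratio
    (linear scale, not dB). This maximum mutual information is achieved
    by a zero-mean Gaussian input."""
    if snr_linear < 0:
        raise ValueError("SNR must be non-negative")
    return 0.5 * math.log2(1.0 + snr_linear)

# Example: at SNR = 3 (about 4.8 dB), C = 0.5 * log2(4) = 1 bit/symbol.
print(awgn_capacity_bits_per_symbol(3.0))  # → 1.0
```

Doubling the SNR in the high-SNR regime adds roughly half a bit per symbol, which is the usual "3 dB per half-bit" rule of thumb.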
“…Finally, our analysis employs ideas from classical communication theory [11,32,33]. But whereas source and channel coding attempt to represent information in an efficient format for transmission, cryptographic engineers have the opposite goal: to make their circuits' internal configurations unintelligible to the outside world.…”
Section: Introduction
confidence: 99%
“…This proposition is well known (Varshamov-Gilbert lemma; see [7]). We give a sketch of the proof here for the reader's convenience.…”
Section: Proof of Proposition
confidence: 84%
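The Varshamov-Gilbert (Gilbert-Varshamov) lemma referenced in this excerpt guarantees the existence of a q-ary code of length n and minimum distance d with at least qⁿ / V_q(n, d−1) codewords, where V_q(n, r) is the volume of a Hamming ball of radius r. A minimal sketch of the bound's computation (the function name is mine, not from the cited proof):

```python
from math import comb

def gilbert_varshamov_lower_bound(n: int, d: int, q: int = 2) -> int:
    """Lower bound on the maximum size A_q(n, d) of a q-ary code of
    length n with minimum Hamming distance d:

        A_q(n, d) >= q**n / V_q(n, d - 1),

    where V_q(n, r) = sum_{j=0}^{r} C(n, j) * (q - 1)**j is the number
    of words within Hamming distance r of a fixed word. Since code sizes
    are integers, the ceiling of the ratio is returned."""
    ball_volume = sum(comb(n, j) * (q - 1) ** j for j in range(d))
    return -(-(q ** n) // ball_volume)  # ceiling division

# Example: binary, n = 10, d = 3. V_2(10, 2) = 1 + 10 + 45 = 56,
# so A_2(10, 3) >= ceil(1024 / 56) = 19.
print(gilbert_varshamov_lower_bound(10, 3))  # → 19
```

The greedy proof sketch behind the lemma picks codewords one at a time, each choice deleting at most one Hamming ball of radius d−1 from the remaining space, so the process cannot stop before the bound is reached.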