2019
DOI: 10.1109/tit.2018.2890208
On the Capacity of the Peak Power Constrained Vector Gaussian Channel: An Estimation Theoretic Perspective

Abstract: This paper studies the capacity of an n-dimensional vector Gaussian noise channel subject to the constraint that an input must lie in the ball of radius R centered at the origin. It is known that in this setting the optimizing input distribution is supported on a finite number of concentric spheres. However, the number, the positions and the probabilities of the spheres are generally unknown. This paper characterizes necessary and sufficient conditions on the constraint R such that the input distribution suppo…
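For orientation, the setting described in the abstract can be written compactly as follows; the notation here is chosen for illustration (noise normalized to unit variance) and need not match the paper's own symbols:

\[
\mathbf{Y} = \mathbf{X} + \mathbf{Z}, \qquad \mathbf{Z} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}_n), \qquad \|\mathbf{X}\| \le R \ \text{almost surely},
\]
\[
C(R) = \max_{P_{\mathbf{X}} :\; \|\mathbf{X}\| \le R} I(\mathbf{X}; \mathbf{Y}).
\]

The structural result quoted in the abstract is that the maximizing input distribution places its mass on finitely many concentric spheres, with the number, radii, and probabilities of those spheres generally unknown.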

Cited by 34 publications (42 citation statements). References 33 publications.
“…The first approach uses the maximum entropy principle [11, Chapter 12] and upper bounds the output differential entropy, h(Y), subject to some moment constraint [12]. [Footnote 1: The formal proof is incomplete, as the argument in [8] relies on a conjecture that the number of points in the optimal distribution increases by one as the amplitude constraint is relaxed; this fact was later proved in [9].]…”
Section: A. Prior Work
confidence: 99%
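As context for the first approach mentioned in this excerpt, a minimal sketch of the maximum-entropy argument in the scalar case with unit-variance noise (the bound below is illustrative and is not quoted from the cited works):

\[
I(X;Y) = h(Y) - h(Z) \le \tfrac{1}{2}\log\!\big(2\pi e\,(\mathbb{E}[X^2] + 1)\big) - \tfrac{1}{2}\log(2\pi e) \le \tfrac{1}{2}\log\!\big(1 + A^2\big),
\]

since a Gaussian maximizes differential entropy among all distributions with a given second moment, and the amplitude constraint |X| ≤ A forces E[X²] ≤ A².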
“…A suboptimal choice of an output distribution in the dual capacity expression results in an upper bound on the capacity [15], [16], [17], [18]. The third approach uses a characterization of the mutual information as an integral of the minimum mean square error (MMSE) [19], and leads to an upper bound by replacing the optimal estimator in the MMSE term with a suboptimal one [9]. As for lower bounds on the capacity, the first one relevant to our setting, as mentioned above, was proposed by Shannon in [7] and is based on the entropy power inequality.…”
Section: A. Prior Work
confidence: 99%
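The two upper-bounding techniques mentioned in this excerpt can be summarized as follows; this is a sketch with illustrative notation, not a quotation from the cited references. The duality bound states that for any choice of output distribution Q_Y,

\[
C \le \max_{\mathbf{x} :\; \|\mathbf{x}\| \le R} D\big(P_{\mathbf{Y}\mid \mathbf{X}=\mathbf{x}} \,\big\|\, Q_{\mathbf{Y}}\big),
\]

so any tractable but suboptimal Q_Y yields an explicit upper bound. The I-MMSE relation expresses the mutual information as

\[
I\big(\mathbf{X}; \sqrt{\mathrm{snr}}\,\mathbf{X} + \mathbf{Z}\big) = \frac{1}{2}\int_0^{\mathrm{snr}} \mathrm{mmse}(\gamma)\, d\gamma, \qquad \mathrm{mmse}(\gamma) = \mathbb{E}\Big[\big\|\mathbf{X} - \mathbb{E}[\mathbf{X}\mid \mathbf{Y}_{\gamma}]\big\|^2\Big],
\]

and replacing the conditional-mean estimator by any suboptimal estimator can only increase the mean square error, which upper-bounds the integrand and hence the mutual information.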
“…Sharma and Shamai [6] extend the result of Smith and argue that an equiprobable input on {±A} is optimal if and only if A ≤ Ā ≈ 1.665. A proof of the result in [6], which generalizes to vector channels, is given in [7].…”
Section: Introduction
confidence: 96%
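To make the quantity in this excerpt concrete, here is a minimal numerical sketch (in Python, with names of my own choosing) that evaluates the mutual information achieved by an equiprobable input on {±A} over the scalar unit-variance Gaussian channel. It only computes I(X;Y) for a few amplitudes; it does not re-derive the threshold Ā ≈ 1.665 cited above.

import numpy as np

def mi_binary_input(A, num_points=20001, span=12.0):
    # I(X;Y) in nats for Y = X + Z, Z ~ N(0,1), X equiprobable on {+A, -A}.
    y = np.linspace(-span, span, num_points)
    dy = y[1] - y[0]
    phi = lambda t: np.exp(-t ** 2 / 2) / np.sqrt(2 * np.pi)   # standard normal pdf
    p_y = 0.5 * (phi(y - A) + phi(y + A))                      # output density
    h_y = -np.sum(p_y * np.log(p_y + 1e-300)) * dy             # h(Y) by Riemann sum
    h_z = 0.5 * np.log(2 * np.pi * np.e)                       # h(Z) for unit variance
    return h_y - h_z

for A in (0.5, 1.0, 1.665, 2.5):
    print(f"A = {A:5.3f}   I(X;Y) ~ {mi_binary_input(A):.4f} nats")

Per the excerpt, the two-point input is capacity-achieving only up to Ā ≈ 1.665; for larger amplitudes the optimal input requires additional mass points, so the value computed above falls below capacity there.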