2013
DOI: 10.1109/tit.2013.2258397
Error Exponent for Gaussian Channels With Partial Sequential Feedback

Abstract: This paper studies the error exponent of block coding over an additive white Gaussian noise channel where a fraction f of the channel output symbols are revealed to the transmitter through noiseless feedback. If the code rate exceeds fC, where C is the channel capacity, then the probability of decoding error cannot decay faster than exponentially with block length. However, if the code rate is below fC, the error probability can decrease faster than exponentially with the block length, as with full feedback (f = 1). …
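The rate threshold fC from the abstract is easy to make concrete. A minimal sketch, using the standard AWGN capacity formula C = (1/2) log2(1 + SNR) with a hypothetical SNR of 15 and feedback fraction f = 0.5 (both values chosen purely for illustration, not taken from the paper):

```python
import math

def awgn_capacity(snr):
    """Capacity of a real AWGN channel in bits per channel use: C = (1/2) log2(1 + SNR)."""
    return 0.5 * math.log2(1 + snr)

# Hypothetical parameters for illustration only.
snr = 15.0   # linear signal-to-noise ratio, so C = 0.5 * log2(16) = 2.0
f = 0.5      # fraction of output symbols fed back

C = awgn_capacity(snr)
threshold = f * C  # rates above f*C can decay at best exponentially;
                   # rates below f*C admit faster-than-exponential decay
print(C, threshold)  # 2.0 1.0
```

This separates the two regimes the abstract describes: an exponential-decay ceiling above f·C, and the possibility of super-exponential decay below it.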

Cited by 8 publications (17 citation statements)
References 17 publications
“…For example, if x = (1, 2, 3, 4, 5) and s = (0, 1, 1, 0, 1), then x_s = (2, 3, 5). The all-ones tuple is denoted 1, and if s is a binary tuple then s^c stands for the componentwise difference 1 − s. The Euclidean norm and inner product are denoted ‖·‖ and ⟨·, ·⟩.…”
Section: Notation and Preliminariesmentioning
confidence: 99%
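The subsampling notation in the snippet above can be sketched directly. A minimal illustration (the helper names `subsample` and `complement` are made up here, not from the cited paper):

```python
def subsample(x, s):
    """Return x_s: the subtuple of x at positions where the binary mask s is 1."""
    return tuple(xi for xi, si in zip(x, s) if si == 1)

def complement(s):
    """Return s^c, the componentwise difference 1 - s for a binary tuple s."""
    return tuple(1 - si for si in s)

x = (1, 2, 3, 4, 5)
s = (0, 1, 1, 0, 1)
print(subsample(x, s))               # (2, 3, 5), matching the example in the quote
print(complement(s))                 # (1, 0, 0, 1, 0)
print(subsample(x, complement(s)))   # (1, 4): the symbols NOT selected by s
```

In the partial-feedback setting, s marks which channel outputs are fed back, so x_s and x_{s^c} split a block into its observed and unobserved parts.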
“…Different feedback models have been studied, such as rate-limited [1], noisy [2]–[4], or partial feedback [5]. In [6] we considered the case introduced in [7] where the feedback is "intermittent," i.e., where each channel output is fed back with probability ρ.…”
Section: Introductionmentioning
confidence: 99%
“…This can only decrease the probability of error. It is shown in [4] that when a fixed fraction f of the output symbols is fed back to the transmitter (the positions of the symbols that are fed back are fixed and known to the transmitter and receiver), and R is larger than f times the capacity, then the probability of error cannot decay faster than exponentially. To apply this result here, let K denote the event that the number of symbols fed back in the first n channel uses is no more than (ρ + ε)n, i.e.,…”
Section: Positive Ratesmentioning
confidence: 99%
“…Our model is more pessimistic than the one in [4], and indeed our converse for the positive-rate case is based on the converse in [4]. In this paper, most of the effort goes into the achievability proofs.…”
Section: Introductionmentioning
confidence: 99%