2008
DOI: 10.1007/s00453-008-9192-0

Random Matrices and Codes for the Erasure Channel

Abstract: The design of erasure correcting codes and their decoding algorithms is now at the point where capacity achieving codes are available with decoding algorithms that have complexity that is linear in the number of information symbols. One aspect of these codes is that the overhead (number of coded symbols beyond the number of information symbols required to achieve decoding completion with high probability) is linear in k. This work considers a new class of random codes which have the following advantages: (i) t…
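To make the notion of overhead concrete, here is a minimal, illustrative sketch of a random linear fountain code over GF(2) decoded by Gaussian elimination. It is not the construction analysed in the paper; the byte-valued symbols, k = 100, and the 10-symbol overhead are arbitrary choices for the example.

```python
# Illustrative sketch only (not the paper's construction): each coded symbol is
# the XOR of a uniformly random subset of the k information symbols, and the
# receiver decodes by Gaussian elimination over GF(2) once it has collected
# k + overhead coded symbols.
import random

def encode_symbol(data, rng):
    """Return (gf2_row, coded_value) for one coded symbol."""
    row = [rng.randint(0, 1) for _ in range(len(data))]    # random GF(2) combination
    value = 0
    for bit, symbol in zip(row, data):
        if bit:
            value ^= symbol                                 # XOR of the selected symbols
    return row, value

def decode(rows, values, k):
    """Gauss-Jordan elimination over GF(2); returns the k symbols or None."""
    rows = [r[:] for r in rows]
    values = list(values)
    pivot_row_of = [None] * k
    for i in range(len(rows)):
        col = next((c for c in range(k)
                    if rows[i][c] and pivot_row_of[c] is None), None)
        if col is None:                      # row is linearly dependent on earlier rows
            continue
        pivot_row_of[col] = i
        for j in range(len(rows)):           # clear this column in every other row
            if j != i and rows[j][col]:
                rows[j] = [a ^ b for a, b in zip(rows[j], rows[i])]
                values[j] ^= values[i]
    if any(p is None for p in pivot_row_of):
        return None                           # matrix not full rank: need more symbols
    return [values[pivot_row_of[c]] for c in range(k)]

rng = random.Random(0)
data = [rng.randrange(256) for _ in range(100)]            # k = 100 byte-valued symbols
coded = [encode_symbol(data, rng) for _ in range(110)]     # k + 10 received symbols
rows, values = zip(*coded)
recovered = decode(rows, values, len(data))
print("decoded" if recovered == data else "need more coded symbols")
```

With dense random combinations the probability of decoding failure after o extra symbols is below 2^(-o), so a handful of extra symbols suffices regardless of k; the price is the cubic cost of Gaussian elimination rather than the linear-time belief-propagation decoding of LT/Raptor codes.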

Cited by 32 publications (50 citation statements) · References 10 publications

“…For the case of a single packet transmission (denoted above as "single"), we demonstrate the gain of the adaptive coding scheme by showing performance of different coding implementations: fixed RS (F-RS) and punctured RS (P-RS) with coded word of 8 b, Fountain [24], and Raptor codes using an RS outer code [we note that similar values were obtained for Raptor codes using a low-density parity check (LDPC) outer code]. As performance of the RS scheme varies greatly with the erasure patterns, we show performance results for both independent identically distributed (i.i.d.)…”
Section: Performance Results and Discussion (mentioning)
confidence: 99%
“…However, since in underwater acoustic communication the transmission rate is on the order of a few kilobits per second and packets are small [23], we expect [the information word length] to be on the order of a few tens to a thousand symbols. Therefore, since popular LT and Raptor codes perform well only for large code word lengths, for our numerical results, we apply the Fountain coding scheme described in [24], where good performance results are obtained for information word lengths as low as , at the cost of somewhat increased decoding complexity. When the channel cannot be modeled as an erasure channel, the integration of newly arrived samples into message-passing decoding is somewhat more complicated.…”
Section: Methods (mentioning)
confidence: 99%
“…A.3.3) that a real-valued matrix filled with independent and identically distributed random variables, with a continuous probability distribution, will be singular with probability zero. On the contrary, in GF(2) the probability that a square random binary matrix is singular tends to approximately 71.1% as its dimension tends to infinity [15]. The scheme proposed in [13] stated that the random matrices A and B must be singular without specifying the value of the ranks.…”
Section: Recovering the Original Image From a Single Share (mentioning)
confidence: 99%
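For reference, that constant comes from the classical product formula: the probability that a uniformly random n×n matrix over GF(2) is nonsingular is the product over i of (1 - 2^(-i)), which tends to about 0.2888 as n grows, so the singular probability tends to about 0.7112, i.e. 71.1%. A short check (the matrix size and trial count below are arbitrary):

```python
# Limiting singularity probability of a random square GF(2) matrix:
# P(nonsingular) -> prod_{i>=1} (1 - 2**-i) ~= 0.2888, so P(singular) -> ~0.7112.
import random

def nonsingular_limit(terms=64):
    p = 1.0
    for i in range(1, terms + 1):
        p *= 1.0 - 2.0 ** -i
    return p

def is_singular_gf2(rows, n):
    """Rank test over GF(2); each row is an n-bit integer bitmask."""
    rank = 0
    for col in range(n):
        pivot = next((r for r in range(rank, len(rows)) if rows[r] >> col & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for r in range(len(rows)):
            if r != rank and rows[r] >> col & 1:
                rows[r] ^= rows[rank]
        rank += 1
    return rank < n

print(f"limiting P(singular) ~ {1.0 - nonsingular_limit():.4f}")   # ~0.7112

rng = random.Random(1)
n, trials = 64, 2000
hits = sum(is_singular_gf2([rng.getrandbits(n) for _ in range(n)], n)
           for _ in range(trials))
print(f"Monte Carlo estimate at n={n}: {hits / trials:.3f}")
```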
“…The performance issues described above lead to the question of whether it is possible to get better intermediate decoding performance, and hence better delay performance, by changing some of the constraints. In Studholme and Blake [11] it was shown that by changing the class of degree distributions allowed and the decoding algorithm, it is possible to decode all the uncoded packets with almost constant overhead irrespective of the block length. In this paper we complement this result by showing, with a modification of a known code construction, that changing the class of degree distributions allowed and the decoding algorithm allows one to also achieve optimal intermediate decoding universally, i.e., for any given r ∈ [0, 1], (asymptotically) the decoder does not need to receive a linear excess of coded packets over rn.…”
Section: A Background Materials and Motivation (mentioning)
confidence: 99%
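The "almost constant overhead" behaviour is easy to see in a small simulation. The sketch below assumes dense, uniformly random GF(2) combinations decoded by Gaussian elimination (an illustration of the flavour of the result, not the specific degree distribution or decoder of [11]): the average number of coded packets beyond n needed before the received matrix reaches full rank settles around 1.6, independent of n.

```python
# Average excess (overhead) of random coded packets needed to reach full rank
# over GF(2): with dense random combinations it is a small constant (~1.6),
# irrespective of the block length n.
import random

def overhead_until_full_rank(n, rng):
    """Draw random GF(2) rows (n-bit integers) until rank n; return rows used minus n."""
    basis = [0] * n      # basis[c] holds a stored row whose lowest set bit is column c
    rank, received = 0, 0
    while rank < n:
        row = rng.getrandbits(n)
        received += 1
        for c in range(n):                 # reduce the new row against the basis
            if not row >> c & 1:
                continue
            if basis[c]:
                row ^= basis[c]
            else:
                basis[c] = row             # new independent row: rank grows by one
                rank += 1
                break
    return received - n

rng = random.Random(2)
for n in (50, 200, 800):
    avg = sum(overhead_until_full_rank(n, rng) for _ in range(200)) / 200
    print(f"n = {n:3d}: average overhead ~ {avg:.2f} packets")
```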