We compare the performance of short-length linear binary codes on the binary erasure channel and the binary-input Gaussian channel. We use universal decoders that can decode any linear binary block code: a Gaussian-elimination-based Maximum-Likelihood decoder on the erasure channel and a probabilistic Ordered Statistics Decoder on the Gaussian channel. As such, we compare codes rather than decoders. The word error rate versus the channel parameter is found for LDPC, Reed-Muller, Polar, and BCH codes at length 256 bits. BCH codes outperform the other codes in the absence of a cyclic redundancy check. Under joint decoding, the concatenation of a cyclic redundancy check makes all codes perform very close to optimal lower bounds.
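To make the erasure-channel decoder concrete, the sketch below performs Maximum-Likelihood decoding on the binary erasure channel by Gaussian elimination over GF(2): the erased bits are the unknowns of a linear system whose right-hand side is the syndrome contributed by the correctly received bits. This is a minimal sketch under stated assumptions (numpy integer arrays with 0/1 entries, erasure positions passed as an index list), not the authors' implementation.

```python
import numpy as np

def ml_erasure_decode(H, y, erased):
    """Solve H[:, erased] @ x_E = H[:, known] @ y_known over GF(2).
    Returns the recovered codeword, or None if the erased positions
    are not uniquely recoverable (ML decoding failure)."""
    n = H.shape[1]
    erased = list(erased)
    known = [i for i in range(n) if i not in set(erased)]
    A = (H[:, erased] % 2).astype(np.uint8)
    b = ((H[:, known] @ y[known]) % 2).astype(np.uint8)
    rows, cols = A.shape
    row = 0
    # Gaussian elimination over GF(2): one pivot per erased position.
    for col in range(cols):
        piv = next((r for r in range(row, rows) if A[r, col]), None)
        if piv is None:
            return None                    # rank deficiency: decoding fails
        A[[row, piv]] = A[[piv, row]]      # swap pivot row into place
        b[[row, piv]] = b[[piv, row]]
        for r in range(rows):
            if r != row and A[r, col]:
                A[r] ^= A[row]             # eliminate the column elsewhere
                b[r] ^= b[row]
        row += 1
    x = np.zeros(n, dtype=np.uint8)
    x[known] = np.asarray(y)[known] % 2    # erased entries of y are ignored
    x[erased] = b[:cols]                   # solved values of the erased bits
    return x
```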
We compare the performance of a selection of short-length and very-short-length linear binary error-correcting codes on the binary-input Gaussian noise channel and on the fast and quasi-static flat Rayleigh fading channels. We use the probabilistic Ordered Statistics Decoder, which is universal to any code construction. As such, we compare codes rather than decoders. The word error rate versus the signal-to-noise ratio is found for LDPC, Reed-Muller, Polar, Turbo, Golay, random, and BCH codes at lengths 20, 32, and 256 bits. BCH and random codes outperform the other codes in the absence of a cyclic redundancy check concatenation. Under joint decoding, the concatenation of a cyclic redundancy check makes all codes perform very close to optimal lower bounds. Optimizations of the Ordered Statistics Decoder are discussed and shown to bring near-ML performance with a notable reduction in complexity, making the decoding complexity at very short length affordable.
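The sketch below illustrates the basic order-w Ordered Statistics Decoder: sort positions by reliability, reduce the permuted generator matrix to systematic form on the most reliable independent positions, re-encode every test pattern of weight up to w on that basis, and keep the candidate with the smallest correlation discrepancy. It is a minimal sketch assuming a full-rank generator matrix G and LLRs where positive values favor bit 0; it does not include the complexity optimizations discussed in the paper.

```python
import numpy as np
from itertools import combinations

def osd_decode(G, llr, order=2):
    """Order-w Ordered Statistics Decoding of a binary linear code with
    full-rank generator matrix G (k x n) from channel LLRs."""
    k, n = G.shape
    hard = (llr < 0).astype(np.int64)       # LLR > 0 decided as bit 0
    perm = np.argsort(-np.abs(llr))         # most reliable positions first
    rel = np.abs(llr)[perm]
    hard_p = hard[perm]
    Gp = (G[:, perm] % 2).astype(np.int64)
    # Gaussian elimination over GF(2): find k independent reliable columns.
    basis, row = [], 0
    for col in range(n):
        if row == k:
            break
        piv = next((r for r in range(row, k) if Gp[r, col]), None)
        if piv is None:
            continue
        Gp[[row, piv]] = Gp[[piv, row]]
        for r in range(k):
            if r != row and Gp[r, col]:
                Gp[r] ^= Gp[row]
        basis.append(col)
        row += 1
    mrb = hard_p[basis]                     # hard decisions on the basis
    best_cw, best_metric = None, np.inf
    for w in range(order + 1):
        for flips in combinations(range(k), w):
            test = mrb.copy()
            for i in flips:
                test[i] ^= 1                # flip w basis positions
            cand = (test @ Gp) % 2          # re-encode the test pattern
            metric = rel[cand != hard_p].sum()   # correlation discrepancy
            if metric < best_metric:
                best_metric, best_cw = metric, cand
    cw = np.empty(n, dtype=np.int64)
    cw[perm] = best_cw                      # undo the reliability sort
    return cw
```

The number of re-encodings grows as the sum of binomial coefficients C(k, 0) + ... + C(k, w), which is why the reduction techniques mentioned in the abstract matter in practice.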
Constellation shaping is necessary to approach channel capacity at information rates above 1 bit per dimension. Probabilistic shaping exhibits a small gap to capacity; however, it requires a complex distribution matcher to modify the source distribution. Spherical shaping of lattice constellations also reduces the gap to capacity, but practical Voronoi shaping is feasible in small dimensions only. In this paper, our codebook is a real, geometrically nonuniform, Gaussian-like constellation. We prove that this discrete codebook achieves channel capacity as the number of points goes to infinity. We then build a special mapping to interface non-binary low-density codes with the codebook, allowing the code alphabet size to equal the square root of the codebook size. Excellent performance is shown with fast encoding and practical iterative probabilistic decoding, e.g. a 0.7 dB gap to capacity at 6 bits/s/Hz with a code defined over the ring Z/8Z.
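To illustrate the idea of a geometrically nonuniform Gaussian-like codebook and of an alphabet equal to the square root of the codebook size, the sketch below places equiprobable points at Gaussian quantiles and addresses them with pairs of ring symbols. Both the quantile construction and the pair mapping are illustrative assumptions, not the paper's exact codebook or interface.

```python
import numpy as np
from scipy.stats import norm

def gaussian_like_constellation(M, power=1.0):
    """Place M equiprobable points at the Gaussian quantiles (i + 0.5)/M and
    normalize to the target average power, giving a geometrically
    nonuniform, Gaussian-like one-dimensional codebook."""
    x = norm.ppf((np.arange(M) + 0.5) / M)
    return x * np.sqrt(power / np.mean(x ** 2))

def map_symbol_pair(a, b, q, points):
    """Hypothetical interface: a pair of code symbols (a, b) over Z/qZ
    addresses one point of a q^2-point codebook, so the code alphabet size
    equals the square root of the codebook size."""
    return points[a * q + b]

# Example: a 64-point codebook addressed by pairs of Z/8Z symbols (6 bits/point).
points = gaussian_like_constellation(64)
s = map_symbol_pair(3, 5, 8, points)
```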