To facilitate applications in IoT, 5G, and beyond, there is an engineering need to enable high-rate, low-latency communications. Errors in physical channels typically arrive in clumps, but most decoders are designed assuming that channels are memoryless. As a result, communication networks rely on interleaving over tens of thousands of bits so that channel conditions match decoder assumptions. Even for short, high-rate codes, awaiting sufficient data to interleave at the sender and de-interleave at the receiver is a significant source of unwanted latency. Using existing decoders with non-interleaved channels degrades block error rate performance owing to the mismatch between the decoder's channel model and true channel behaviour. Through further development of the recently proposed Guessing Random Additive Noise Decoding (GRAND) algorithm, which we call GRAND-MO for GRAND Markov Order, here we establish that by abandoning interleaving and embracing bursty noise, low-latency, short-code, high-rate communication is possible with block error rates that are substantially better than those of interleaved counterparts. Moreover, while most decoders are twinned to a specific code-book structure, GRAND-MO can decode any code. Using this property, we establish that certain well-known structured codes are ill-suited for use in bursty channels, whereas Random Linear Codes (RLCs) are robust to correlated noise. This work suggests that the combination of RLCs and GRAND-MO is a good candidate for applications requiring high throughput with low latency.
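To make the guessing mechanism concrete, the following is a minimal, illustrative Python sketch of the core GRAND loop, assuming a binary linear code described by a parity-check matrix H and hard-decision demodulation. The function and variable names (grand_decode, hamming_weight_order, max_queries) are hypothetical, not taken from the paper. GRAND-MO differs from basic hard-detection GRAND only in the order in which candidate noise sequences are queried, deriving that order from a Markov burst-noise model rather than from Hamming weight, so the noise_patterns argument is where a Markov-ordered enumeration would be substituted.

import itertools
import numpy as np

def grand_decode(y, H, noise_patterns, max_queries=10**6):
    # y: received hard-decision bits (0/1 numpy array of length n)
    # H: parity-check matrix of the code (0/1 numpy array, shape (n - k, n))
    # noise_patterns: iterable of candidate noise vectors, most likely first;
    #   GRAND-MO would order these according to a Markov burst-noise model.
    queries = 0
    for queries, e in enumerate(noise_patterns, start=1):
        c = (y + e) % 2                    # strip the guessed noise from the received word
        if not (H @ c % 2).any():          # code-book membership check: zero syndrome
            return c, queries              # the first hit is returned as the decoding
        if queries >= max_queries:
            break
    return None, queries                   # abandon guessing and report a failure

def hamming_weight_order(n, max_weight=3):
    # Memoryless-channel ordering used by basic hard-detection GRAND:
    # the all-zero pattern first, then single bit flips, then pairs, and so on.
    yield np.zeros(n, dtype=int)
    for w in range(1, max_weight + 1):
        for idx in itertools.combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(idx)] = 1
            yield e

Because the only code-specific ingredient is the membership test, the same loop decodes any code for which such a test is available, which is what the universality claim refers to.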
Modern applications are driving demand for ultra-reliable low-latency communications, rekindling interest in the performance of short, high-rate error correcting codes. To that end, here we introduce a soft-detection variant of Guessing Random Additive Noise Decoding (GRAND) called Ordered Reliability Bits GRAND (ORBGRAND) that can decode any moderate-redundancy block code. For a code of n bits, it avails of no more than log2(n) bits of code-book-independent quantized soft-detection information per received bit to determine an accurate decoding, while retaining the original algorithm's suitability for a highly parallelized implementation in hardware. ORBGRAND is shown to provide similar block error performance for codes of distinct classes (BCH, CA-Polar, and RLC) with low complexity, while providing better block error rate performance than CA-SCL, a state-of-the-art soft-detection CA-Polar decoder.
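The ordering ORBGRAND relies on can be sketched in the same style: rank the received bits from least to most reliable by |LLR| and query noise patterns in increasing logistic weight, the sum of the (1-based) reliability ranks of the flipped positions, which is why only on the order of log2(n) bits of rank information per received bit are needed. The sketch below is an illustration under those assumptions rather than the paper's implementation; the cap max_logistic_weight and the helper names are hypothetical, and the generator is written to plug into the grand_decode loop sketched above.

import numpy as np

def distinct_partitions(total, max_part):
    # All ways to write `total` as a sum of distinct positive integers <= max_part.
    if total == 0:
        yield []
        return
    for first in range(min(total, max_part), 0, -1):
        for rest in distinct_partitions(total - first, first - 1):
            yield [first] + rest

def orbgrand_patterns(llrs, max_logistic_weight=None):
    # Yield noise patterns in increasing logistic weight, where rank 1 is the
    # least reliable received bit (smallest |LLR|).
    n = len(llrs)
    rank_to_index = np.argsort(np.abs(llrs))   # rank r corresponds to index rank_to_index[r - 1]
    if max_logistic_weight is None:
        max_logistic_weight = n                # illustrative cap only
    yield np.zeros(n, dtype=int)               # the no-error guess is always queried first
    for lw in range(1, max_logistic_weight + 1):
        for parts in distinct_partitions(lw, min(lw, n)):
            e = np.zeros(n, dtype=int)
            e[rank_to_index[[p - 1 for p in parts]]] = 1
            yield e

For example, grand_decode(y, H, orbgrand_patterns(llrs)) would run the soft-detection variant on the same hard-decision word y, with the reliabilities influencing only the query order.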
CRC codes have long been adopted in a vast range of applications. The established notion that they are suitable primarily for error detection can be set aside through use of the recently proposed Guessing Random Additive Noise Decoding (GRAND). Hard-detection (GRAND-SOS) and soft-detection (ORBGRAND) variants can decode any short, high-rate block code, making them suitable for error correction of CRC-coded data. When decoded with GRAND, short CRC codes have error correction capability that is at least as good as that of popular codes such as BCH codes, but with no restriction on either code length or rate. The state-of-the-art CA-Polar codes are concatenated CRC and Polar codes. For error correction, we find that the CRC is a better short code than either Polar or CA-Polar codes. Moreover, the standard CA-SCL decoder only uses the CRC for error detection and therefore suffers severe performance degradation in short, high-rate settings when compared with GRAND, which uses all of the CA-Polar bits for error correction. Using GRAND, existing systems can be upgraded from error detection to low-latency error correction without re-engineering the encoder, and additional applications of CRCs can be found in IoT, Ultra-Reliable Low-Latency Communication (URLLC), and beyond. The universality of GRAND, its ready parallelized implementation in hardware, and the good performance of CRCs as codes make their combination a viable solution for low-latency applications.
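As a rough illustration of how CRC-coded data slots into the same guessing loop, the code-book membership test becomes a polynomial-division remainder check: a word is in the CRC code book exactly when its remainder is zero. The sketch below assumes a plain CRC with no bit reflection, initial value, or final XOR, and the helper names are hypothetical.

def crc_remainder(word, poly_bits):
    # Long division of the received word (message bits followed by CRC bits,
    # most significant bit first) by the generator polynomial, also given MSB
    # first, e.g. [1, 0, 1, 1] for x^3 + x + 1.
    bits = list(word)
    for i in range(len(bits) - len(poly_bits) + 1):
        if bits[i]:
            for j, p in enumerate(poly_bits):
                bits[i + j] ^= p
    return bits[-(len(poly_bits) - 1):]

def crc_codebook_check(word, poly_bits):
    # Zero remainder means the word is a CRC codeword, so this test can replace
    # the parity-check test in the GRAND loop sketched earlier.
    return not any(crc_remainder(word, poly_bits))

# With generator x^3 + x + 1, the message 1101 encodes to the codeword 1101001.
assert crc_codebook_check([1, 1, 0, 1, 0, 0, 1], [1, 0, 1, 1])
assert not crc_codebook_check([1, 1, 0, 1, 0, 1, 1], [1, 0, 1, 1])   # a single bit flip is caught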