Abstract: One of the most significant impediments to the use of LDPC codes in many communication and storage systems is the error-rate floor phenomenon associated with their iterative decoders. The error floor has been attributed to certain sub-graphs of an LDPC code's Tanner graph induced by so-called trapping sets. We show in this paper that once we identify the trapping sets of an LDPC code of interest, a sum-product algorithm (SPA) decoder can be custom-designed to yield floors that are orders of magnitude lower than…
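For readers unfamiliar with the (a, b) trapping-set notation that appears in the snippets below (e.g. TS(4,2)), here is a minimal sketch, assuming a binary parity-check matrix H held as a NumPy array, of how the parameters of a given set of variable nodes are computed: a is the number of variable nodes and b the number of odd-degree ("unsatisfied") check nodes in the induced subgraph. The function name and the toy H are illustrative only, not taken from the cited papers.

```python
import numpy as np

def trapping_set_params(H, var_subset):
    """Return the (a, b) parameters of the subgraph induced by `var_subset`:
    a = number of variable nodes, b = number of check nodes connected to the
    subset an odd number of times.  Illustrative helper only; searching for
    the dominant trapping sets of a code is a separate, hard problem."""
    cols = np.asarray(sorted(var_subset))
    deg = H[:, cols].sum(axis=1)            # how many subset variables each check sees
    a = cols.size
    b = int(np.count_nonzero(deg % 2))      # odd-degree (unsatisfied) checks
    return a, b

# Toy example with a hypothetical 4x8 parity-check matrix.
H = np.array([[1, 1, 0, 0, 1, 0, 0, 1],
              [0, 1, 1, 0, 0, 1, 0, 0],
              [0, 0, 1, 1, 0, 0, 1, 1],
              [1, 0, 0, 1, 0, 1, 1, 0]])
print(trapping_set_params(H, {0, 1, 3}))    # -> (3, 2)
```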
“…Other methods such as interleaver design, modifying the sum-product algorithm and adding an outer code [14] are also applicable to GIRA codes and may further improve their error floor performance.…”
Section: Results
mentioning
confidence: 99%
“…3 ), a constant combiner rate 2 (ρ = x), interleaver Π = [1,4,5,11,2,7,8,12,6,13,9,14,3,10]. (12)-(14), decoded with sum-product decoding with a maximum of 1000 iterations.…”
Abstract-In this paper, we present a new class of iteratively decoded error correction codes. These codes, which are a modification of irregular repeat-accumulate (IRA) codes, are termed generalized IRA (GIRA) codes, and are designed for improved error floor performance. GIRA codes are systematic, easily encodable, and are decoded with the sum-product algorithm. In this paper we present a density evolution algorithm to compute the threshold of GIRA codes, and find GIRA degree distributions which produce codes with good thresholds. We then propose inner code designs and show using simulation results that they improve upon the error floor performance of IRA codes.
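The GIRA density evolution algorithm itself is not reproduced in this snippet. As a rough illustration of how an iterative-decoding threshold is estimated, the sketch below runs a sampled (Monte Carlo) density evolution for a plain regular (dv, dc) LDPC ensemble on the BI-AWGN channel, not for the GIRA ensemble of the paper; all parameter values, the function name, and the quoted threshold are assumptions for illustration.

```python
import numpy as np

def sampled_density_evolution(dv=3, dc=6, sigma=0.88, iters=200, n=100_000, rng=None):
    """Monte Carlo density evolution for a regular (dv, dc) LDPC ensemble on the
    BI-AWGN channel (all-zero codeword, BPSK).  Tracks a population of n message
    samples and returns the fraction of negative variable-to-check messages after
    `iters` iterations (an estimate of the residual error probability)."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Channel LLRs are Gaussian with mean 2/sigma^2 and standard deviation 2/sigma.
    ch = rng.normal(2.0 / sigma**2, 2.0 / sigma, size=n)
    v = ch.copy()                                    # variable-to-check messages
    for _ in range(iters):
        # Check-node update: tanh rule over dc-1 independently drawn messages.
        t = np.tanh(0.5 * v[rng.integers(0, n, size=(dc - 1, n))])
        u = 2.0 * np.arctanh(np.clip(t.prod(axis=0), -1 + 1e-12, 1 - 1e-12))
        # Variable-node update: channel LLR plus dv-1 independently drawn messages.
        v = ch + u[rng.integers(0, n, size=(dv - 1, n))].sum(axis=0)
    return float(np.mean(v < 0))

# Coarse threshold check: the regular (3,6) ensemble converges for sigma below
# roughly 0.88 under sum-product decoding.
for sigma in (0.80, 0.86, 0.90, 0.95):
    print(sigma, sampled_density_evolution(sigma=sigma))
```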
“…Figure 1 shows an erroneous TS(4,2) of an LDPC code with bit-node and check-node degrees (3,6), in which the LLR value of a BN is computed as the sum of the intrinsic information and the extrinsic information from the three connected CNs.…”
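In standard SPA notation (assumed here, not quoted from the cited paper), the bit-node computation described in that snippet can be written as:

```latex
% A-posteriori LLR of bit node b_i: intrinsic (channel) term plus the
% extrinsic messages from the check nodes c in N(i) connected to b_i.
\[
L(b_i) = L_{\mathrm{ch}}(b_i) + \sum_{c \in N(i)} L_{c \to b_i},
\qquad
L_{c \to b_i} = 2\tanh^{-1}\!\Big( \prod_{b_j \in N(c)\setminus\{b_i\}} \tanh\big(\tfrac{1}{2} L_{b_j \to c}\big) \Big).
\]
```

For the degree-3 bit nodes of the (3,6) code in the quote, the sum over N(i) contains exactly three extrinsic terms.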
Section: BPA and G-LDPC Decoders
mentioning
confidence: 99%
“…It was observed through simulations of many LDPC codes that, in the error floor region, a frame error event usually contains a single TS. The authors in [3,4] propose combining CNs into Super Check Nodes (SCNs) so that information from the error-free TS can correct the erroneous TS. For example, the Margulis code introduced in [8] contains 1320 TS and 1320 TS (Figure 1), which are the most dominant TS in the error floor region of the Margulis code.…”
Section: BPA and G-LDPC Decoders
mentioning
confidence: 99%
“…To reduce the influence of TS, the authors in [3,4] proposed a solution: find the most dominant TS and use a G-LDPC (generalized LDPC) decoder, which improves decoding quality at high Eb/N0 values (the error floor region). However, identifying the TS is complex and difficult to implement for long LDPC codes.…”
Abstract-The article introduces a new decoder for LDPC codes based on the general check matrix and soft syndrome. Simulation results show that the new decoder can improve the performance of LDPC codes. Compared with some other improvements, the new decoding algorithm is simpler, can detect errors, and can be applied to long LDPC codes.
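The abstract does not spell out its soft-syndrome computation in this snippet. One common notion of a soft syndrome pairs the hard-decision parity of each check with a reliability given by the smallest |LLR| among the bits participating in it; the sketch below implements that notion purely as an assumption and is not necessarily the definition used in the cited article.

```python
import numpy as np

def soft_syndrome(H, llr):
    """Hard syndrome plus a per-check reliability: the parity comes from the
    hard decisions and the magnitude from the least reliable participating bit.
    (Illustrative notion of a "soft syndrome"; may differ from the cited paper.)"""
    hard = (llr < 0).astype(int)                              # hard decisions from LLRs
    syn = (H @ hard) % 2                                      # 1 marks an unsatisfied check
    rel = np.where(H == 1, np.abs(llr), np.inf).min(axis=1)   # min |LLR| per check
    return syn, rel

# Tiny example with a hypothetical 3x6 parity-check matrix and LLR vector.
H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]])
llr = np.array([2.1, -0.4, 1.7, 3.0, -1.2, 0.9])
print(soft_syndrome(H, llr))
```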
In this paper, we propose a new approach to construct a class of check‐hybrid generalized low‐density parity‐check (CH‐GLDPC) codes, which are free of small trapping sets. The approach is based on converting some selected check nodes involving a trapping set into super checks corresponding to a 2‐error‐correcting component code. Specifically, we pursue 2 main purposes in constructing the check‐hybrid codes: first, on the basis of the knowledge of the trapping sets of an LDPC code, single parity checks are replaced by super checks to disable the trapping sets. We show that by converting specified single check nodes, denoted as critical checks, to super checks in a trapping set, the parallel bit flipping decoder corrects the errors on a trapping set. The second purpose is to minimize the rate loss by finding the minimum number of such critical checks. We also present an algorithm to find critical checks in a trapping set of a column‐weight 3 LDPC code of girth 8 and then provide upper bounds on the minimum number of such critical checks such that the decoder corrects all error patterns on elementary trapping sets. The guaranteed error correction capability of the CH‐GLDPC codes is also studied. We show that a CH‐GLDPC code in which each variable node is connected to 2 super checks corresponding to a 2‐error‐correcting component code corrects up to 5 errors. The results are also extended to column‐weight 4 LDPC codes of girth 6. Finally, we investigate the elimination of trapping sets of a column‐weight 3 LDPC code of girth 8 using the Gallager B decoding algorithm.
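As a reference point for the parallel bit-flipping decoder mentioned in this abstract, here is a minimal textbook-style variant, not the specific flipping rules, thresholds, or component-code decoders analyzed in the paper: in each iteration, every bit whose unsatisfied checks form a strict majority of its checks is flipped simultaneously. The matrix H and the received word y below are toy values chosen for illustration.

```python
import numpy as np

def parallel_bit_flipping(H, y, max_iters=50):
    """Generic parallel bit-flipping decoder for a binary code with parity-check
    matrix H and hard-decision input y (0/1 vector).  Flips, in parallel, every
    bit for which a strict majority of its checks are unsatisfied."""
    x = y.copy()
    col_deg = H.sum(axis=0)                 # number of checks each bit participates in
    for _ in range(max_iters):
        syn = (H @ x) % 2                   # 1 marks an unsatisfied check
        if not syn.any():
            return x, True                  # valid codeword found
        unsat = H.T @ syn                   # unsatisfied checks touching each bit
        x = np.where(2 * unsat > col_deg, 1 - x, x)
    return x, False

# Toy column-weight-3 code (bits = vertices of K4, checks = its edges); the only
# codewords are 0000 and 1111.  A single error on the all-zero codeword is corrected.
H = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 1, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 1]])
y = np.array([1, 0, 0, 0])
print(parallel_bit_flipping(H, y))          # -> (array([0, 0, 0, 0]), True)
```

On a column-weight-3 code of girth at least 6, two bits share at most one check, so this rule corrects any single error; the interest of the cited work is in the multi-error patterns (trapping sets) where such local majorities fail.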