2022
DOI: 10.1109/tcomm.2022.3218821
Decoding Short LDPC Codes via BP-RNN Diversity and Reliability-Based Post-Processing

Cited by 8 publications (5 citation statements)
References 38 publications
“…Finally, OSD-p post-processing is applied after a decoding diversity of 3 BP-RNNs, denoted D3. The construction method of this decoding diversity is detailed in [13]. For each BP-RNN of D3, a single neuron is optimized with the parameters described in Section IV-A.…”
Section: B. CCSDS Code
confidence: 99%
“…Directly employing the observed LLRs as the reliability measure has also been investigated in [12]. OSD post-processing is likewise carried out in [13] after a decoding diversity composed of several BP decoders modeled with recurrent neural networks (BP-RNNs), each trained to decode a different kind of error event.…”
Section: Introduction
confidence: 99%
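The OSD post-processing referenced in the statements above can be sketched at order 1 on a toy (7,4) Hamming code. This is a minimal illustrative sketch, not the cited papers' implementations: the code, the LLR sign convention (negative LLR means bit 1), and all names here are assumptions for illustration.

```python
# Order-1 ordered statistics decoding (OSD-1) sketch for a (7,4) Hamming code.
G = [
    [1, 0, 0, 0, 0, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 1, 1, 1],
]
n, k = 7, 4

def osd1(llr):
    # Sort positions from most to least reliable (|LLR| descending).
    order = sorted(range(n), key=lambda i: -abs(llr[i]))
    # Gaussian elimination over GF(2), pivoting on columns in reliability
    # order, to find k independent most-reliable positions (the MRB).
    rows = [r[:] for r in G]
    mrb, row = [], 0
    for col in order:
        if row == k:
            break
        piv = next((r for r in range(row, k) if rows[r][col]), None)
        if piv is None:
            continue  # column dependent on previous pivots; skip it
        rows[row], rows[piv] = rows[piv], rows[row]
        for r in range(k):
            if r != row and rows[r][col]:
                rows[r] = [a ^ b for a, b in zip(rows[r], rows[row])]
        mrb.append(col)
        row += 1

    hard = [1 if x < 0 else 0 for x in llr]  # assumed sign convention

    def reencode(info):
        # rows now have identity columns on the MRB, so info bits on the
        # MRB extend linearly to a full candidate codeword.
        return [sum(info[i] & rows[i][j] for i in range(k)) % 2
                for j in range(n)]

    def score(cw):
        # Correlation discrepancy: total |LLR| where cw flips a hard decision.
        return sum(abs(llr[j]) for j in range(n) if cw[j] != hard[j])

    base = [hard[c] for c in mrb]
    best = reencode(base)
    # Order-1 reprocessing: also try flipping each single MRB bit.
    for i in range(k):
        cand = base[:]
        cand[i] ^= 1
        cw = reencode(cand)
        if score(cw) < score(best):
            best = cw
    return best
```

Order-p reprocessing generalizes the final loop to flip all patterns of up to p MRB bits; the diversity schemes quoted above run OSD only on the BP-RNN outputs that fail, which keeps the average cost low.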
“…40), yet both are surpassed by about 0.5 dB by the decoder 'NBP-D(10,4,4)' [32], which is in turn slightly inferior to 'NMS-2-OSD(C-2)-N'. When the order increases to p = 3 for the OSD, the performance of the related variants improves further, among which 'NMS-2-OSD(D-3)-Y' approaches the SOTA decoder 'D10-OSD-2' [33], which is within a 0.2 dB gap of ML decoding [42].…”
Section: B. Decoding Performance
confidence: 99%
“…For the (128,64) and (1023,880) codes, the DIA procedure accounts for about 220k (31M) FLOPs with 247 (2071) trainable parameters. In contrast, the SOTA decoder D10-OSD-2(25) [33] requires a set of BP-RNN decoders to be trained, whose trainable parameters sum to 10240. Therefore, our approach can greatly relieve the training load.…”
Section: Complexity Analysis
confidence: 99%