2022
DOI: 10.1109/tcomm.2021.3114315

Guessing Random Additive Noise Decoding With Symbol Reliability Information (SRGRAND)

Cited by 39 publications (28 citation statements)
References 90 publications
“…In this section, we analyze the proposed joint decryption-decoding scheme as presented in Section IV using GRAND [7]. GRAND algorithms operate by sequentially inverting putative noise effects from the demodulated received sequence and querying if what remains is in the code-book [7], [21]-[23]. If those noise effects are queried in order from most likely to least likely, the first instance where a code-book member is found is an ML decoding [7].…”
Section: Decrypting Encoded Data
confidence: 99%
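To make the query-and-check loop described in that excerpt concrete, the sketch below implements hard-detection GRAND for a binary linear code under the assumption of a memoryless binary symmetric channel, so that noise patterns of lower Hamming weight are queried first. The (7,4) Hamming parity-check matrix, the received word, and the query-weight cutoff are illustrative choices, not values from the cited works.

```python
# A minimal sketch of hard-detection GRAND, assuming a memoryless binary
# symmetric channel so lower-weight noise patterns are more likely.
from itertools import combinations

import numpy as np


def grand_decode(y, H, max_weight=3):
    """Invert putative noise patterns in order of decreasing likelihood
    (increasing Hamming weight) and return the first code-book member found."""
    n = len(y)
    for w in range(max_weight + 1):              # weight-0 query first: is y itself a codeword?
        for flips in combinations(range(n), w):  # all noise patterns of weight w
            e = np.zeros(n, dtype=int)
            e[list(flips)] = 1
            candidate = (y + e) % 2              # invert the putative noise effect
            if not (H @ candidate % 2).any():    # code-book membership check: H c^T = 0
                return candidate                 # first hit is the ML decoding
    return None                                  # abandon after exhausting the query budget


# Example: (7,4) Hamming code, one bit flipped by the channel.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
codeword = np.array([1, 0, 1, 1, 0, 1, 0])
received = codeword.copy()
received[2] ^= 1
print(grand_decode(received, H))                 # recovers the transmitted codeword
```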
“…Symbol Reliability GRAND (SRGRAND) [38], [39] is a variant that avails of the most limited quantized soft information where one additional bit tags each demodulated symbol as being reliably or unreliably received. SRGRAND retains the desirable parallelizability of the original algorithm, is readily implementable in hardware, and provides a 0.5-0.75 dB gain over hard-detection GRAND [39]. At the other extreme, Soft GRAND (SGRAND) [40] is a variant that uses real-valued soft information per demodulated bit to build a dedicated noise-effect query order for each received signal.…”
Section: Introduction (Shannon's Pioneering Work)
confidence: 99%
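As a rough illustration of how that single reliability bit can be used, the sketch below confines GRAND's noise-effect queries to the positions the demodulator has tagged unreliable, trying the fewest flips first. The parity-check matrix, received word, and reliability mask are hypothetical examples, not taken from the paper.

```python
# A minimal sketch of the SRGRAND idea: one extra bit per demodulated symbol
# marks it reliable (True) or unreliable (False), and noise-effect queries are
# restricted to the unreliable positions only.
from itertools import combinations

import numpy as np


def srgrand_decode(y, reliable, H):
    """Query noise patterns supported only on unreliable symbols and
    return the first candidate that passes the code-book (parity) check."""
    unreliable = [i for i, r in enumerate(reliable) if not r]
    for w in range(len(unreliable) + 1):             # weight-0 query first
        for flips in combinations(unreliable, w):    # patterns confined to unreliable bits
            e = np.zeros(len(y), dtype=int)
            e[list(flips)] = 1
            candidate = (y + e) % 2                  # invert the putative noise effect
            if not (H @ candidate % 2).any():        # H c^T = 0: candidate is a codeword
                return candidate
    return None                                      # no codeword reachable: abandon


# Example: (7,4) Hamming code, transmitted codeword 1011010, bit 2 flipped,
# with the demodulator tagging positions 2 and 5 as unreliable.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
received = np.array([1, 0, 0, 1, 0, 1, 0])
reliable = [True, True, False, True, True, False, True]
print(srgrand_decode(received, reliable, H))         # recovers 1011010
```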
“…A natural question is how to make use of soft detection information, when it is available, in order to improve GRAND's query order, and several proposals have been made. Symbol Reliability GRAND (SRGRAND) [24], [25] avails of the most limited quantized soft information where one additional bit tags each demodulated symbol as being reliably or unreliably received. SRGRAND is mathematically analysable, implementable in hardware, and provides a 0.5-0.75 dB gain over hard-detection GRAND [25]. Soft GRAND (SGRAND) [26] uses real-valued soft information per demodulated bit to build a bespoke noise-effect query order for each received signal.…”
Section: Introduction
confidence: 99%
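A bespoke per-signal query order of the kind SGRAND uses can be sketched as follows, assuming independent per-bit log-likelihood ratios (LLRs): the cost of a candidate noise effect is the summed |LLR| of the bits it flips, and a min-heap pops candidates from most to least likely. The LLR values, parity-check matrix, successor rule, and query budget here are illustrative assumptions rather than the cited algorithm's exact construction.

```python
# A minimal sketch of an SGRAND-style query order built from per-bit soft
# information. Flipping a set of bits costs the sum of their |LLR|s, so a
# min-heap over that cost yields noise effects from most to least likely.
import heapq

import numpy as np


def sgrand_decode(llrs, H, max_queries=10_000):
    """Noise-effect queries in maximum-likelihood order, built from per-bit LLRs."""
    llrs = np.asarray(llrs, dtype=float)
    y = (llrs < 0).astype(int)                        # hard decisions from the LLR signs
    cost = np.abs(llrs)                               # reliability of each hard decision
    order = np.argsort(cost)                          # least reliable positions first
    sorted_cost = cost[order]

    def is_codeword(c):
        return not (H @ c % 2).any()

    if is_codeword(y):                                # weight-0 query: y itself
        return y
    # Heap entries: (total |LLR| cost, flipped positions in the sorted order).
    heap = [(sorted_cost[0], (0,))]
    for _ in range(max_queries):
        if not heap:
            break
        total, flips = heapq.heappop(heap)
        e = np.zeros(len(y), dtype=int)
        e[order[list(flips)]] = 1                     # map sorted indices back to bit positions
        candidate = (y + e) % 2
        if is_codeword(candidate):
            return candidate                          # first hit is the ML codeword
        j = flips[-1]
        if j + 1 < len(y):                            # generate the two successor patterns
            heapq.heappush(heap, (total + sorted_cost[j + 1], flips + (j + 1,)))
            heapq.heappush(heap, (total - sorted_cost[j] + sorted_cost[j + 1],
                                  flips[:-1] + (j + 1,)))
    return None                                       # query budget exhausted


# Example: (7,4) Hamming code where position 2 is both wrong and least reliable.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
llrs = np.array([-4.1, 3.6, 0.3, -3.8, 2.9, -4.4, 3.2])
print(sgrand_decode(llrs, H))                         # recovers 1011010
```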