2019
DOI: 10.1109/jlt.2019.2895065

Hierarchical Distribution Matching for Probabilistically Shaped Coded Modulation

Abstract: The implementation difficulties of combining distribution matching (DM) and dematching (invDM) for probabilistic shaping (PS) with soft-decision forward error correction (FEC) coding can be relaxed by reverse concatenation, in which the FEC coding and decoding lie inside the shaping algorithms. PS can seemingly achieve performance close to the Shannon limit, although there are practical implementation challenges that need to be carefully addressed. We propose a hierarchical DM (HiDM) scheme, having fully par…


Cited by 88 publications (83 citation statements)
References 36 publications
“…The figure shows that, as the number of layers increases, the rate loss decreases significantly for the same overall memory, or, equivalently, the memory required for a given rate loss can be substantially reduced. Finally, the figure shows that the performance (in terms of rate loss versus memory) of the structure proposed in [6,8] can be further improved with some of these structures.…”
Section: and Once D
Mentioning, confidence: 98%
“…Consequently, the rate R of the DM is lower than the entropy rate H that would be obtained with a sequence of i.i.d. amplitudes with the same target distribution, yielding the rate loss R_loss = H − R ≥ 0. Different methods to realize a DM with a low rate loss have recently been proposed [4]–[7]. While the rate loss usually tends to zero as the block length N increases (but with different convergence speeds for different DMs), this happens at the expense of increased computational cost, memory, and/or latency.…”
Mentioning, confidence: 99%
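The rate-loss definition quoted above can be made concrete with a small numerical sketch. The snippet below is my own illustration, not code from the paper or the citing work: it computes R_loss = H − R for a fixed-composition (CCDM-style) block, where the block length and the composition counts are hypothetical example values.

# Minimal sketch (hypothetical values): rate loss R_loss = H - R of a
# constant-composition distribution matcher (CCDM)-style block.
from math import comb, log2

def multinomial(counts):
    """Number of distinct sequences with the given symbol counts."""
    total, result = 0, 1
    for c in counts:
        total += c
        result *= comb(total, c)
    return result

def ccdm_rate_loss(counts):
    """Rate loss (bits/symbol) of a fixed-composition DM versus the
    entropy of its empirical symbol distribution."""
    n = sum(counts)
    # k input bits can address at most 2^k of the valid sequences,
    # so k = floor(log2 M) with M the multinomial coefficient.
    k = int(log2(multinomial(counts)))
    rate = k / n                                                 # R, bits/symbol
    entropy = -sum((c / n) * log2(c / n) for c in counts if c)   # H
    return entropy - rate                                        # R_loss >= 0

# Example: 4 amplitude levels, block length N = 32, roughly geometric shape.
print(ccdm_rate_loss([16, 8, 5, 3]))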
“…The combination of long blocks required for low loss and sequential processing has led to a great deal of research into advanced amplitude shaper schemes that improve upon CCDM. Examples of fixed-length amplitude shapers include multiset-partition distribution matching (MPDM) [27], multicomposition DM [28], prefix-free code DM with framing [29], enumerative sphere shaping (ESS) [17], shell mapping (SM) [8], [30], and hierarchical DM [31].…”
Section: Introduction
Mentioning, confidence: 99%
“…For practical communication systems and new requirements such as ultra-reliable low-latency communication (URLLC), shorter output blocklengths in the range of 10 to 500 symbols are desirable to minimize the processing latency and limit error propagation. Research is therefore now dedicated to finding improved DM architectures for short blocklengths, e.g., [8], [9]. Good performance for short blocklengths is also needed to operate several DMs in parallel to further reduce processing latencies.…”
Section: Introduction
Mentioning, confidence: 99%
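The two quotes above make a single quantitative point: low rate loss requires long blocks, while latency constraints push toward blocklengths of roughly 10 to 500 symbols. The sketch below, again my own illustration and not taken from any of the cited works, sweeps that blocklength range for a hypothetical four-level target distribution and prints the resulting rate loss; it reuses the multinomial() and ccdm_rate_loss() helpers from the previous sketch.

# Sketch (hypothetical target distribution): CCDM-style rate loss versus
# block length N over the 10-500 symbol range mentioned above.
# Requires multinomial() and ccdm_rate_loss() from the previous snippet.

p = [0.5, 0.25, 0.125, 0.125]   # hypothetical target amplitude distribution

for n in (10, 50, 100, 500):
    # Quantize the target distribution to an integer composition of size n.
    counts = [round(p_i * n) for p_i in p]
    counts[0] += n - sum(counts)          # absorb rounding so counts sum to n
    print(n, round(ccdm_rate_loss(counts), 4))

The printed rate loss shrinks as N grows, which is the trade-off driving both the search for shapers that improve upon CCDM and the interest in architectures that remain efficient at short blocklengths.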