Optical Fiber Communication Conference (OFC) 2020
DOI: 10.1364/ofc.2020.th1g.4
Hierarchical Distribution Matching: a Versatile Tool for Probabilistic Shaping

Abstract: The hierarchical distribution matching (Hi-DM) approach for probabilistic shaping is described. The potential of Hi-DM in terms of the trade-off between performance, complexity, and memory is illustrated through three case studies. © 2020 The Author(s). OCIS codes: 060.1660, 060.2330, 060.4080

Introduction: Recently, probabilistic shaping (PS) techniques have been widely investigated to improve the performance and flexibility of optical fiber networks. By assigning different probabilities to the constellation sym…

Cited by 10 publications (5 citation statements). References 7 publications.
“…In practice, a DM with a reasonably low rate loss requires a long block length (typically, hundreds of symbols), making its implementation based on a single look-up table (LUT) unfeasible. Finding a good trade-off between rate loss and implementation complexity is a key aspect in the design of DMs. Different techniques for the implementation of DMs have been proposed in the past years, including the enumerative sphere shaping (ESS) and its variations [24]–[26]; the constant composition distribution matcher (CCDM) and its variations [10], [22]; and the hierarchical DM (HiDM), which, using multiple DM layers (each based, for instance, on ESS, CCDM, or even LUTs), achieves a lower rate loss compared to the equivalent-complexity single-layer DM [12], [13], [27]. We refer to [20] and references therein for more details about DM implementations.…”
Section: Probabilistic Constellation Shapingmentioning
confidence: 99%
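The block-length versus rate-loss trade-off described in the statement above can be made concrete with a toy example. The sketch below is an illustrative Python construction (not code from the paper or the cited works): it builds a single-LUT constant-composition DM for a 4-symbol block with amplitude composition {1, 1, 1, 3} and computes its rate loss against the entropy of the target amplitude distribution.

```python
from itertools import permutations
from math import log2

def ccdm_lut(composition):
    """Toy single-LUT constant-composition DM (illustrative only).

    composition: dict amplitude -> count per block, e.g. {1: 3, 3: 1}.
    Returns (lut, k): a table mapping each k-bit input index to one
    fixed-composition codeword of length n = sum of the counts.
    """
    base = [a for amp, cnt in composition.items() for a in [amp] * cnt]
    # All distinct permutations of the fixed composition form the codebook.
    codewords = sorted(set(permutations(base)))
    k = int(log2(len(codewords)))  # usable input bits (floor of log2)
    lut = {i: codewords[i] for i in range(2 ** k)}
    return lut, k

# Composition {1,1,1,3}: 4 distinct codewords -> k = 2 bits per n = 4 symbols.
lut, k = ccdm_lut({1: 3, 3: 1})
n = 4
target = {1: 3 / 4, 3: 1 / 4}
entropy = -sum(p * log2(p) for p in target.values())
rate_loss = entropy - k / n  # roughly 0.31 bit/symbol for this tiny block
```

Longer blocks shrink the rate loss toward zero, but the codebook grows combinatorially, which is exactly why a single flat LUT becomes impractical and layered (hierarchical) constructions are attractive.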
“…We do not need an operational mode change such as updating the source statistics based on prior knowledge, although this would be a kind of data compression. Our previously proposed look-up-table (LUT)-based hierarchical DM [18] and subsequent works [29], [30] are applicable for this purpose without a significant increase in complexity, requiring only a reordering of the LUT entries. The proposed technique can reduce the rate losses associated with source and channel coding compared with state-of-the-art DM schemes such as constant-composition DM (CCDM) [15], or reduce the power consumption in the FEC at a given information rate and signal-to-noise ratio (SNR) by relaxing the FEC performance.…”
Section: Introductionmentioning
confidence: 99%
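The layered-LUT idea referenced in the statements above can be sketched with a deliberately tiny two-layer example. This is a hypothetical construction for illustration only, not the scheme of [18] or the follow-up works: a lower layer holds two small LUTs with different energy profiles, and an upper-layer LUT shapes which lower LUT is used for each block, so the concatenated output is nonuniform while every mapping stays invertible.

```python
# Toy two-layer hierarchical DM (hypothetical construction, for illustration).
# Lower layer: two 1-bit LUTs mapping one bit to a length-2 amplitude block.
# "B" blocks start with amplitude 1 (low energy), "A" blocks with 3, so each
# block's label is recoverable from its first symbol (keeps decoding unique).
lower = {
    "B": {0: (1, 1), 1: (1, 3)},
    "A": {0: (3, 1), 1: (3, 3)},
}

# Upper layer: a 1-bit LUT over lower-LUT labels, biased toward "B" so the
# concatenated output favors low-energy blocks.
upper = {0: ("B", "B"), 1: ("B", "A")}

def hidm_encode(upper_bit, lower_bits):
    """Map 1 upper bit + 2 lower bits to 4 shaped amplitudes."""
    out = []
    for label, bit in zip(upper[upper_bit], lower_bits):
        out += lower[label][bit]
    return tuple(out)
```

For example, `hidm_encode(1, (0, 1))` yields `(1, 1, 3, 3)`. The toy scheme carries 3 input bits per 4 amplitudes, and the same layering idea scales to many layers of larger LUTs, which is how HiDM trades memory for rate loss.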
“…In [73], a shaper based on ESS was introduced to shape a subset of the amplitude bit-levels, which is referred to as partial ESS. The authors of [74] introduced the “hierarchical” DM, which realizes a nonuniform distribution with hierarchical lookup tables (LUTs) [75]. An approximate sphere shaping implementation based on Huffman codes was proposed in [76].…”
Section: Introductionmentioning
confidence: 99%