2017
DOI: 10.1109/tit.2017.2713833
Approximate Message-Passing Decoder and Capacity Achieving Sparse Superposition Codes

Abstract: We study the approximate message-passing decoder for sparse superposition coding on the additive white Gaussian noise channel and extend our preliminary work [1]. We use heuristic statistical-physics-based tools such as the cavity and the replica methods for the statistical analysis of the scheme. While superposition codes asymptotically reach the Shannon capacity, we show that our iterative decoder is limited by a phase transition similar to the one that happens in low-density parity-check (LDPC) codes. We consider …

Cited by 109 publications (184 citation statements)
References 57 publications
“…The replica symmetric formula for the potential of random linear estimation was recently proved rigorously [8]. The proof of the SE equations of [18] for general signal distributions fixes a conjecture in the proof of [23] and shows that the replica symmetric formula holds true when the information vector m follows a block-iid distribution. In this work we study sums of K_a SPARC signals.…”
Section: A. Prior Work and Contribution (mentioning)
confidence: 99%
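The state evolution (SE) equations referenced above track the effective noise level of an AMP decoder across iterations. As an illustration only (a generic soft-threshold AMP with a Bernoulli-Gaussian prior, not the paper's SPARC-specific recursion), a Monte Carlo SE fixed point can be sketched as:

```python
import numpy as np

def state_evolution(sigma2, delta, eps, lam, iters=15, n_mc=200_000, seed=0):
    """Monte Carlo state evolution for a generic soft-threshold AMP.

    sigma2 : channel noise variance
    delta  : measurement ratio m/n
    eps    : sparsity of the (assumed) Bernoulli-Gaussian prior
    lam    : threshold multiplier (threshold = lam * sqrt(tau2))
    """
    rng = np.random.default_rng(seed)
    # Samples from the prior: X ~ eps * N(0, 1) + (1 - eps) * delta_0.
    x = rng.standard_normal(n_mc) * (rng.random(n_mc) < eps)
    tau2 = sigma2 + np.mean(x**2) / delta  # effective noise at iteration 0
    for _ in range(iters):
        z = rng.standard_normal(n_mc)
        r = x + np.sqrt(tau2) * z              # pseudo-data R = X + tau * Z
        thr = lam * np.sqrt(tau2)
        x_hat = np.sign(r) * np.maximum(np.abs(r) - thr, 0.0)
        mse = np.mean((x_hat - x) ** 2)
        tau2 = sigma2 + mse / delta            # SE update: tau_{t+1}^2
    return tau2
```

Iterating this scalar recursion to a fixed point is what reveals the decoder's phase transition: below a critical measurement ratio the fixed point stays at high MSE even though the globally optimal estimate would be accurate.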
“…This base matrix construction was also used in [79] for SC-SPARCs. Other base matrix constructions can be found in [72], [35], [8], [12].…”
Section: Spatially Coupled SPARC Construction (mentioning)
confidence: 99%
“…In [8], a small fraction of the sections of β are fixed a priori; this pinning condition is used to analyze the state evolution equations via the potential function method. Analogously, in the construction in [12], additional rows are introduced in the design matrix for the blocks corresponding to the first row of the base matrix. In an (ω, Λ) base matrix, the fact that the number of rows in the base matrix exceeds the number of columns by (ω − 1) helps decoding start from both ends.…”
Section: Spatially Coupled SPARC Construction (mentioning)
confidence: 99%
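The (ω, Λ) base matrix described in this excerpt has Λ columns and Λ + ω − 1 rows, with each column coupling ω consecutive rows. A minimal sketch of that band-diagonal structure (the function name and the uniform 1/ω normalization are our own assumptions, not taken from the cited construction):

```python
import numpy as np

def omega_lambda_base_matrix(omega, lam_cols):
    """Band-diagonal (omega, Lambda) base matrix for a spatially coupled SPARC.

    Column c couples rows c .. c + omega - 1. The omega - 1 extra rows at
    the boundaries are what lets decoding start from both ends.
    """
    rows = lam_cols + omega - 1
    W = np.zeros((rows, lam_cols))
    for c in range(lam_cols):
        W[c:c + omega, c] = 1.0 / omega  # spread each column's weight uniformly
    return W
```

For example, `omega_lambda_base_matrix(3, 8)` gives a 10 x 8 matrix whose first and last rows touch only one column each, seeding the two decoding waves.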
“…Also, recent work by Fengler, Jung, and Caire [10] draws a close connection between the sparse structure created by CCS and sparse regression codes (SPARCs) [11]–[13]. Therein, they leverage the CCS data structure, but pair it with a dense CS matrix (rather than the CCS block-diagonal structure) and employ approximate message passing (AMP) [14], [15] to decode it.…”
Section: Introduction (mentioning)
confidence: 99%
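AMP decoding, as invoked in this excerpt, alternates a matched-filter step with a componentwise denoiser plus an Onsager correction term. A minimal generic sketch (soft-threshold denoiser for a plain sparse linear model; the SPARC decoder in the paper instead uses a section-wise posterior-mean denoiser):

```python
import numpy as np

def amp(y, A, lam=2.0, iters=30):
    """Generic AMP for y = A x + w with a soft-threshold denoiser.

    The threshold adapts to the effective noise level, estimated from the
    residual z; b * z is the Onsager correction term.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(iters):
        tau = np.sqrt(np.mean(z ** 2))                     # effective noise estimate
        r = x + A.T @ z                                    # pseudo-data
        x = np.sign(r) * np.maximum(np.abs(r) - lam * tau, 0.0)
        # Onsager term: empirical divergence of the soft threshold,
        # i.e. the fraction of coordinates that survived thresholding.
        b = np.count_nonzero(x) / m
        z = y - A @ x + b * z
    return x
```

On a noiseless toy problem with an i.i.d. Gaussian design and a signal far below the phase-transition sparsity, this recursion converges to an accurate estimate within a few dozen iterations.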