2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
DOI: 10.1109/allerton.2012.6483463

Sparse regression codes for multi-terminal source and channel coding

Abstract: We study a new class of codes for Gaussian multi-terminal source and channel coding. These codes are designed using the statistical framework of high-dimensional linear regression and are called Sparse Superposition or Sparse Regression codes (SPARCs). Codewords are linear combinations of subsets of columns of a design matrix. These codes were introduced by Barron and Joseph and shown to achieve the channel capacity of AWGN channels with computationally feasible decoding. They have also recently been shown to a…
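To make the codeword construction concrete, below is a minimal sketch of a SPARC encoder, assuming the standard Barron–Joseph section structure: the design matrix A is partitioned into L sections of M columns each, and a codeword selects exactly one column per section. The dimensions, i.i.d. Gaussian design, and flat power allocation are illustrative assumptions, not parameters taken from the paper.

```python
import numpy as np

def sparc_codeword(A, section_indices, L, M, P):
    """Map L section indices (each in 0..M-1) to a codeword with power ~P."""
    n, LM = A.shape
    assert LM == L * M
    beta = np.zeros(LM)
    for l, j in enumerate(section_indices):
        # Flat power allocation: each section contributes P / L to the
        # codeword power (column norms are ~1 under the design below).
        beta[l * M + j] = np.sqrt(n * P / L)
    return A @ beta

# Example (illustrative sizes): n = 64 channel uses,
# L = 8 sections of M = 16 columns each.
rng = np.random.default_rng(0)
n, L, M, P = 64, 8, 16, 1.0
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, L * M))  # i.i.d. Gaussian design
msg = rng.integers(0, M, size=L)      # the message: one index per section
x = sparc_codeword(A, msg, L, M, P)
print(x.shape, float(np.mean(x**2)))  # empirical power is close to P
```

With this structure, each codeword encodes L log2(M) bits over n channel uses, so the rate is R = (L log2 M)/n (0.5 bits per channel use in the example above); the capacity- and rate-distortion-achieving regimes discussed in the excerpts below let L and M grow polynomially in n.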

Cited by 12 publications (11 citation statements) | References 38 publications
“…The results of this paper together with those in [5] show that SPARCs with computationally efficient encoders and decoders can be used for both lossy compression and communication, at rates approaching the Shannon-theoretic limits. Further, [15] demonstrates how source and channel coding SPARCs can be nested to implement binning and superposition, which are key ingredients of coding schemes for multi-terminal source and channel coding problems. Sparse regression codes therefore offer a promising framework to develop fast, rate-optimal codes for a variety of models in network information theory.…”
Section: Discussion
confidence: 99%
“…To approach $D^*(R)$, note that we need $n$, $L$, $M$ to all go to $\infty$ while satisfying (1): $n$, $L$ for the probability of error in (15) to be small, and $M$ in order to allow $\delta_2$ to be small according to (23). When $L$, $M$ grow polynomially in $n$, (23) dictates how small $\Delta$ can be: the distortion is $\Theta\!\left(\frac{\log \log M}{\log M}\right)$ higher than the optimal value $D^*(R) = \sigma^2 e^{-2R}$.…”
Section: Remarks
confidence: 99%
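For readability, the scaling claimed in this excerpt can be written in display form (symbols as in the excerpt; a restatement, not a new result):

```latex
% Gap to the Gaussian distortion-rate function when L, M grow
% polynomially in n: the excess distortion decays as log log M / log M.
\[
  D \;=\; D^*(R) \;+\; \Theta\!\left(\frac{\log\log M}{\log M}\right),
  \qquad D^*(R) \;=\; \sigma^2 e^{-2R}.
\]
```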
“…In [10], [11], we showed that SPARCs also attain the optimal rate-distortion function of Gaussian sources with feasible algorithms. Further, we showed in [12] that the source and channel coding modules can be combined to implement superposition and binning, which are key ingredients of several multi-terminal source and channel coding problems. Thus SPARCs offer a promising framework to build fast, rate-optimal codes for a variety of problems in network information theory.…”
Section: Introduction
confidence: 99%
“…This monograph discusses a class of codes for such Gaussian models called Sparse Superposition Codes or Sparse Regression Codes (SPARCs). These codes were introduced by Barron and Joseph [16,65] for efficient communication over AWGN channels, but have since also been used for lossy compression [112,113] and multi-terminal communication [114]. Our goal in this monograph is to provide a unified and comprehensive view of SPARCs, covering theory, algorithms, and practical implementation aspects.…”
Section: Introduction
confidence: 99%