IEEE Information Theory Workshop (ITW 2010)
DOI: 10.1109/itwksps.2010.5503192

Sparse superposition codes for Gaussian vector quantization

Cited by 13 publications (13 citation statements)
References 1 publication

“…Hence for sufficiently large $n$, $P(E_2 \mid E_1^c, E_2^c) < \epsilon/3$. Combining this with (17) and (18), we have $P(E) < \epsilon$.…”
Section: Let the Number of Columns in Each Sub…
confidence: 71%
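
For orientation, the step quoted above closes a union bound over three error events. A plausible reconstruction of the full argument, assuming (17) and (18) bound the first two events by $\epsilon/3$ each (the labels $E_1$, $E_2$, $E_3$ are illustrative, since the excerpt is truncated and does not define them):

$$
P(E) \;\le\; P(E_1) + P(E_2) + P\!\left(E_3 \mid E_1^c, E_2^c\right) \;<\; \tfrac{\epsilon}{3} + \tfrac{\epsilon}{3} + \tfrac{\epsilon}{3} \;=\; \epsilon .
$$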
“…This is mainly to keep exposition simple and to highlight the main contribution of the paper: a demonstration that binning and superposition can be easily implemented with sparse regression ensembles described by compact dictionaries. The results also hold with the feasible SPARC encoders and decoders developed in [16], [17], [19]. Further, we focus only on the achievability of the optimal information-theoretic rates and do not discuss the SPARC error exponents obtained in [14], [18].…”
Section: Introduction
confidence: 81%
“…For every positive integer $n$, let $M_n = L_n^b$, where $L_n$ is determined by (3). Then there exists a sequence $\mathcal{C} = \{C_n\}_{n=1,2,\ldots}$ of rate-$R$ sparse regression codes, with code $C_n$ defined by an $n \times M_n L_n$ design matrix, that attains the optimal error exponent for distortion level $D$ given by (5).…”
Section: Results
confidence: 99%
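
The code structure described in this excerpt is concrete enough to sketch. Below is a minimal illustration in Python of the standard SPARC construction: a Gaussian design matrix with $L$ sections of $M$ columns each, and codewords formed by summing one column per section. All parameter values, the bits-per-sample rate convention, and the function name are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sparc_codeword(A, L, M, indices, scale=1.0):
    """Form a SPARC codeword: the sum of one (scaled) column per section.

    A       -- n x (M*L) design matrix
    indices -- length-L sequence; indices[sec] picks a column in section sec
    """
    beta = np.zeros(M * L)
    for sec, j in enumerate(indices):
        beta[sec * M + j] = scale          # exactly one nonzero per section
    return A @ beta

# Illustrative parameters (not from the paper).
L, b = 4, 2
M = L ** b                                 # the M_n = L_n^b scaling in the excerpt
R = 0.5                                    # rate in bits/sample (convention assumed)
n = int(round(L * np.log2(M) / R))         # M**L codewords => R = L*log2(M)/n

rng = np.random.default_rng(0)
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, M * L))  # i.i.d. N(0, 1/n) entries

x = sparc_codeword(A, L, M, indices=[0, 3, 7, 2])
print(x.shape)                             # (32,)
```

Note the design choice this exhibits: the codebook has $M^L$ codewords, yet it is fully specified by the single $n \times M_n L_n$ matrix, which is the compact-dictionary property the excerpts above emphasize.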
“…Here, the performance of these codes under minimum-distance encoding is studied. The design of computationally feasible encoders will be discussed in future work. Sparse regression codes for compressing Gaussian sources were first considered in [3], where some preliminary results were presented. In this paper, we analyze the performance of these codes under optimal (minimum-distance) encoding and show that they can achieve the distortion-rate bound with the optimal error exponent for all rates above a specified value (approximately 1.15 bits/sample).…”
confidence: 99%
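
Minimum-distance (optimal) encoding, as studied in this excerpt, maps a source sequence to the nearest codeword in Euclidean distance. A brute-force sketch under the same illustrative setup as above; it enumerates all $M^L$ codewords, which is exactly why the excerpt defers computationally feasible encoders to future work.

```python
from itertools import product

import numpy as np

def min_distance_encode(A, L, M, source):
    """Brute-force minimum-distance encoding: return the section indices of
    the codeword closest to `source` in squared Euclidean distance.
    Enumerates all M**L codewords, so it is viable only at toy sizes."""
    best_idx, best_dist = None, np.inf
    for indices in product(range(M), repeat=L):
        beta = np.zeros(M * L)
        for sec, j in enumerate(indices):
            beta[sec * M + j] = 1.0        # one column per section
        dist = np.sum((source - A @ beta) ** 2)
        if dist < best_dist:
            best_idx, best_dist = indices, dist
    return best_idx, best_dist / len(source)   # per-sample distortion

# Toy example (all values illustrative).
L, M, n = 3, 4, 12
rng = np.random.default_rng(1)
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, M * L))
source = rng.normal(size=n)
indices, D = min_distance_encode(A, L, M, source)
print(indices, D)
```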
“…In [8], it was shown that SPARCs achieve the AWGN capacity with feasible decoding. SPARCs for lossy compression were first considered in [9]. In [10], [11], we showed that SPARCs also attain the optimal rate-distortion function of Gaussian sources with feasible algorithms.…”
Section: Introduction
confidence: 99%