2007 IEEE International Symposium on Information Theory
DOI: 10.1109/isit.2007.4557477

Cooperative Source Coding with Encoder Breakdown

Fig. 1. Cooperative source coding with encoder breakdown

Abstract: This paper provides an inner bound to the rate-distortion region of a source coding setup in which two encoders are allowed some collaboration to describe a pair of discrete memoryless sources. We further require some robustness in case one of the encoders breaks down. This is modeled by having a second decoder, observing the messages from only one of the encoders. We prove the tightness of this inner bound for tw…

Cited by 13 publications (14 citation statements)
References 7 publications

“…A seemingly related problem of Vasudevan and Perron [10] does not provide too much further insight into the rate-malleability region. Relating their problem statement to our problem statement requires the rate R1 in our problem to be set to 0 and the decoder for Y to decode both (X, Ŷ).…”
Section: Connections (mentioning)
confidence: 97%
“…The Gaussian example with a helper and without side information Z was solved in [5]. Their result will be used in the sequel to establish our converse for the Gaussian model.…”
Section: B. The Gaussian Case (mentioning)
confidence: 99%
“…Without loss of generality, the encoders can subtract Z from X and Y; hence the problem is equivalent to a new rate-distortion problem with a helper, where the source is A and the helper is A + B. Now, using the result for the Gaussian case from [5], adapted to our notation, we obtain R ≥ (1/2) log …”
Section: B. The Gaussian Case (mentioning)
confidence: 99%
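To make the truncated bound in the snippet above easier to follow, here is a minimal LaTeX sketch of the kind of expression such an argument leads to. It assumes the standard Gaussian rate-distortion-with-a-helper formula, with source A ~ N(0, σ_A²), an independent additive term B ~ N(0, σ_B²), helper rate R_h, and quadratic distortion D; these symbols, the independence assumption, and the exact form of the result in [5] are assumptions for illustration, not recovered from the truncated snippet.

% Sketch only: correlation between the source A and the helper observation A + B,
% assuming A and B are independent zero-mean Gaussians.
\[
  \rho^2 \;=\; \frac{\mathrm{Cov}(A, A+B)^2}{\sigma_A^2\,(\sigma_A^2+\sigma_B^2)}
         \;=\; \frac{\sigma_A^2}{\sigma_A^2+\sigma_B^2}.
\]
% The usual Gaussian rate-distortion-with-a-helper bound then takes the form
\[
  R \;\ge\; \frac{1}{2}\log^{+}\!\left[\frac{\sigma_A^2}{D}
        \left(1-\rho^2+\rho^2\,2^{-2R_h}\right)\right],
  \qquad \log^{+}x = \max(\log x,\,0).
\]

As sanity checks on this form: letting R_h → ∞ recovers the conditional rate-distortion term (1/2) log(σ_A²(1−ρ²)/D), while R_h = 0 gives the unaided (1/2) log(σ_A²/D).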
“…Of special interest in lossy source coding is the Gaussian case with quadratic distortion, which in many source coding problems is amenable to an analytical solution, such as in the Wyner–Ziv problem [12], where side information is available to the decoder; the Heegard–Berger problem [13], where side information at the decoder may be absent; Kaspi's problem [14], [15], where side information is known to the encoder and may or may not be known to the decoder; the multiple description problem [16], [17]; the two-way source coding problem [18]; the multiterminal problem [19], [20]; the CEO problem [21]–[23]; rate distortion with a helper [24], [25]; and successive refinement [26] and its extension to successive refinement for the Wyner–Ziv problem [27].…”
mentioning
confidence: 99%