2014
DOI: 10.1109/tit.2014.2352300
Asymptotically Optimal Decision Rules for Joint Detection and Source Coding

Abstract: The problem of joint detection and lossless source coding is considered. We derive asymptotically optimal decision rules for deciding whether or not a sequence of observations has emerged from a desired information source, and to compress it if it has. In particular, our decision rules asymptotically minimize the cost of compression in the case that the data has been classified as 'desirable', subject to given constraints on the two kinds of error probability. In another version of this performance criterion…
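To make the truncated abstract concrete, the objective can be written schematically as a constrained optimization (all notation below is assumed for illustration, not quoted from the paper): a decision rule chooses between H0 ("undesirable") and H1 ("desirable"), and a lossless code of length L(x^n) is applied only when H1 is declared.

```latex
% Schematic formulation; notation assumed, not taken from the paper.
\begin{aligned}
  \text{minimize}   \quad & \mathbb{E}\!\left[L(X^n) \,\middle|\, \text{decide } H_1\right] \\
  \text{subject to} \quad & \Pr\{\text{decide } H_1 \mid H_0\} \le \epsilon_0, \\
                          & \Pr\{\text{decide } H_0 \mid H_1\} \le \epsilon_1.
\end{aligned}
```

In the asymptotic regime studied in the paper, the constraints on the two kinds of error probability would typically be expressed through their exponential rates rather than fixed values.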

Cited by 5 publications (5 citation statements). References 19 publications.
“…In Subsection 2.3, we have proved the well-known fact that the optimal list decoder provides the L messages with the highest likelihoods. This proof suggests an extension to a recent work [17] (see also [18]) concerning a decoder/detector that first has to decide whether the received channel output y really contains a transmitted message or is simply pure noise (received when the transmitter is silent), and in the former case to decode the message in the ordinary way (L = 1).…”
Section: Future Work (citation type: mentioning)
confidence: 75%
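The idea quoted above can be sketched in code: a detection step first decides whether y contains a transmitted message at all, and only then the L highest-likelihood messages are returned (L = 1 recovers ordinary decoding). This is a minimal illustrative sketch; the specific detection test used here, comparing the total message likelihood to the noise likelihood against a threshold, is an assumption and not the rule derived in [17]/[18].

```python
import numpy as np

def detect_and_decode(likelihoods, noise_likelihood, threshold, L=1):
    """Illustrative detector/decoder (a sketch, not the rule from [17]/[18]).

    likelihoods      : array of W(y | x_m) for the M candidate messages
    noise_likelihood : Q0(y), the likelihood of y when the transmitter is silent
    threshold        : detection threshold (the form of this test is an assumption)
    L                : list size; L = 1 gives ordinary decoding
    """
    likelihoods = np.asarray(likelihoods, dtype=float)
    # Assumed detection test: declare "pure noise" if the total message
    # likelihood does not exceed a scaled noise likelihood.
    if likelihoods.sum() <= threshold * noise_likelihood:
        return None  # y classified as pure noise; nothing to decode
    # List decoding: return the indices of the L highest-likelihood messages
    # (this part is the well-known fact quoted above).
    return np.argsort(likelihoods)[::-1][:L]
```

For example, `detect_and_decode([0.02, 0.6, 0.1], noise_likelihood=0.05, threshold=1.0, L=1)` would declare a message present and return the index of the most likely codeword.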
“…The main challenge in analyzing the random coding FA exponent is that the likelihoods of the two hypotheses, namely $\sum_{m=1}^{M} W(\mathbf{Y}|\mathbf{X}_m)$ and $\sum_{m=1}^{M} V(\mathbf{Y}|\mathbf{X}_m)$, are highly correlated, due to the fact that once the codewords are drawn, they are common to both likelihoods. This is significantly different from the situation in [10], in which the likelihood $\sum_{m=1}^{M} W(\mathbf{Y}|\mathbf{X}_m)$ was compared to a likelihood $Q_0(\mathbf{Y})$ of a completely different distribution. We first make the following observation.…”
Section: A. Exact Random Coding Exponents (citation type: mentioning)
confidence: 82%
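As a schematic illustration of the contrast drawn in the quotation (the threshold notation $e^{nT}$ is assumed here, not taken from the cited text), the correlated setting compares two sums over the same codebook, whereas the test in [10] compares a codebook sum against a fixed distribution:

```latex
% Schematic contrast; threshold notation assumed.
\frac{\sum_{m=1}^{M} W(\boldsymbol{Y}\mid\boldsymbol{X}_m)}
     {\sum_{m=1}^{M} V(\boldsymbol{Y}\mid\boldsymbol{X}_m)} \gtrless e^{nT}
\qquad \text{versus} \qquad
\frac{\sum_{m=1}^{M} W(\boldsymbol{Y}\mid\boldsymbol{X}_m)}
     {Q_0(\boldsymbol{Y})} \gtrless e^{nT}.
```

In the left-hand test both the numerator and the denominator depend on the same randomly drawn codewords $\{\boldsymbol{X}_m\}$, which is the source of the correlation mentioned in the quotation; in the right-hand test the denominator is independent of the codebook.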
“…This problem of joint detection/decoding belongs to a larger class of hypothesis testing problems in which, after performing the test, another task should be performed, depending on the chosen hypothesis. For example, in [7], [8], the problem of joint hypothesis testing and Bayesian estimation was considered, and in [9] the subsequent task is lossless source coding. A common theme for all the problems in this class is that separately optimizing the detection and the task is sub-optimal, and so joint optimization is beneficial.…”
Citation type: mentioning
confidence: 99%
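To illustrate what "separately optimizing the detection and the task" means in the source-coding case of [9], here is a minimal two-stage baseline (all names and the likelihood-ratio test below are assumptions): a threshold test decides whether the sequence came from the desired source, and only then an ideal lossless codelength is computed, with no coupling between the two steps.

```python
import numpy as np

def two_stage_detect_then_compress(x, p_desired, p_null, threshold):
    """Separate (two-stage) baseline: detect, then compress independently.

    This sketch shows only the *separate* optimization that the passage above
    calls sub-optimal; the jointly optimized rules of [9] couple the test with
    the coding cost.  The function name, inputs, and the threshold test are
    assumptions for illustration.

    x         : observed sequence (iterable of symbols)
    p_desired : dict symbol -> probability under the desired source
    p_null    : dict symbol -> probability under the alternative
    threshold : log-likelihood-ratio detection threshold
    """
    # Detection step: simple log-likelihood-ratio test.
    llr = sum(np.log(p_desired[s]) - np.log(p_null[s]) for s in x)
    if llr < threshold:
        return None  # classified as 'undesirable': no coding performed
    # Coding step: ideal lossless codelength (in bits) under the desired model,
    # chosen independently of the detection rule.
    return -sum(np.log2(p_desired[s]) for s in x)
```

The point of the quoted passage is precisely that such a pipeline is sub-optimal compared with decision rules that optimize the detection test and the compression cost jointly.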