2005
DOI: 10.1109/tit.2004.840883

Bounds on Information Combining

Abstract: When the same data sequence is transmitted over two independent channels, or when a data sequence is transmitted twice but independently over the same channel, the independent observations can be combined at the receiver side. From an information-theoretic point of view, the overall mutual information between the data sequence and the received sequences represents a combination of the mutual information of the two channels. This concept is termed information combining. In this paper, a lower bound and an…
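For small alphabets, the combined mutual information described in the abstract can be evaluated directly by summing over the joint distribution. The following Python sketch (all names are illustrative, not from the paper) computes I(X; Y1, Y2) for two conditionally independent channels observing the same input:

```python
import itertools
import math

def combined_mutual_information(px, ch1, ch2):
    """I(X; Y1, Y2) in bits, where Y1 and Y2 are observations of X
    through two independent channels given as transition matrices
    ch[x][y] = P(Y = y | X = x)."""
    mi = 0.0
    for y1, y2 in itertools.product(range(len(ch1[0])), range(len(ch2[0]))):
        # p(y1, y2) = sum_x p(x) p(y1|x) p(y2|x)  (conditional independence)
        py = sum(px[x] * ch1[x][y1] * ch2[x][y2] for x in range(len(px)))
        for x in range(len(px)):
            pxy = px[x] * ch1[x][y1] * ch2[x][y2]   # joint p(x, y1, y2)
            if pxy > 0.0:
                mi += pxy * math.log2(pxy / (px[x] * py))
    return mi

# Example: one uniform bit observed through two BSCs (crossover 0.1 and 0.2).
bsc = lambda p: [[1 - p, p], [p, 1 - p]]
print(combined_mutual_information([0.5, 0.5], bsc(0.1), bsc(0.2)))
```

The combined value exceeds the mutual information of either channel alone, which is the effect the paper's bounds quantify.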

Cited by 89 publications (107 citation statements)
References 21 publications
“…As a special case, if all channels have the same distribution, the average extrinsic mutual information is given by (13), where the th moment is defined in (8).…”
Section: -Consistency and Mutual Information (mentioning)
Confidence: 99%
“…Without loss of generality, one can ask the following question: can we find tight lower and upper bounds on the combined information, i.e., the extrinsic mutual information? Such an extremal problem was first considered in [12] for a repetition code. It was shown that the combined information is maximized (or minimized) when both channels are BECs (or binary-symmetric channels (BSCs)) with prescribed mutual information values.…”
(mentioning)
Confidence: 99%
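The extremal statement quoted above can be checked numerically. Below is a minimal sketch, assuming a uniform binary input, two identical channels, and a per-channel mutual information of 0.5 bit (function names are illustrative): for two independent BEC observations the combined information is 1 − ε², while for two BSC observations it is H(Y1, Y2) − 2·h2(p), and the BEC value comes out on top.

```python
import math

def h2(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_crossover_for_mi(target_mi):
    """Invert I = 1 - h2(p) for p in [0, 1/2] by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(100):
        mid = (lo + hi) / 2
        if 1 - h2(mid) > target_mi:
            lo = mid          # channel too clean: add noise
        else:
            hi = mid
    return (lo + hi) / 2

def combined_mi_bec(mi):
    """Two independent BEC observations of the same bit: the bit
    stays unknown only if both channels erase, so I = 1 - eps^2."""
    eps = 1 - mi              # BEC with uniform input: I = 1 - eps
    return 1 - eps ** 2

def combined_mi_bsc(mi):
    """Two independent BSC observations of a uniform bit:
    I(X; Y1, Y2) = H(Y1, Y2) - 2 * h2(p)."""
    p = bsc_crossover_for_mi(mi)
    q = (1 - p) ** 2 + p ** 2                       # P(Y1 == Y2)
    joint = [q / 2, q / 2, p * (1 - p), p * (1 - p)]
    return -sum(pr * math.log2(pr) for pr in joint) - 2 * h2(p)

mi = 0.5  # per-channel mutual information, in bits
print(f"BSC (lower extreme): {combined_mi_bsc(mi):.4f}")   # ~0.7135
print(f"BEC (upper extreme): {combined_mi_bec(mi):.4f}")   # 0.7500
```

The two values bracket the combined information achievable by any pair of binary-input symmetric channels with the same per-channel mutual information, which is the extremal property the citing paper attributes to [12].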