2008
DOI: 10.1109/tit.2007.911266

Extremal Problems of Information Combining

Abstract: In this paper, we study moments of soft bits of binary-input symmetric-output channels and solve some extremal problems of the moments. We use these results to solve the extremal information combining problem. Further, we extend the information combining problem by adding a constraint on the second moment of soft bits, and find the extremal distributions for this new problem. The results for this extension problem are used to improve the prediction of convergence of the belief propagation decoding of …
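For orientation, here is a minimal sketch of the quantities the abstract refers to, assuming the common convention that the soft bit is the hyperbolic tangent of half the log-likelihood ratio (the paper's exact normalization may differ):

\[
T = \tanh\!\left(\tfrac{L}{2}\right), \qquad m_k = \mathbb{E}\!\left[\,T^k \mid X = +1\,\right],
\]

where $L$ is the channel log-likelihood ratio and $X \in \{+1,-1\}$ is the transmitted bit. Under this convention, the extra constraint mentioned in the abstract would fix the second moment $m_2$ of the soft bit, presumably in addition to the mutual-information constraint of the standard information combining setup.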

Year Published: 2009–2024

Cited by 8 publications (24 citation statements)
References 22 publications
“…Remark 10: This lemma is in fact equivalent to the statement in [20, Theorem 1] with the extreme values derived in its proof (note that (20) implies that the sequence {g_k} is equal to the sequence {m_{2k}} in [20], from which the equivalence between Lemma 5 and [20, Theorem 1] follows directly). In Appendix II, we present an alternative proof which is more elementary.…”
Section: Lemma 5: [Extreme Values of g_1 Among All MBIOS Channels Wit…] (mentioning)
Confidence: 87%
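Read alongside the sketch after the abstract, and assuming (this is an assumption, not a definition taken from either cited work) that g_k denotes the 2k-th soft-bit moment, the claimed equivalence amounts to

\[
g_k = \mathbb{E}\!\left[\tanh^{2k}\!\left(\tfrac{L}{2}\right)\right] = m_{2k}, \qquad k \in \mathbb{N},
\]

so that, under these conventions, extremizing $g_1$ over MBIOS channels is the same problem as extremizing the second soft-bit moment $m_2$ treated in [20, Theorem 1].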
“…Note that for a BEC with erasure probability p, g_k = 1 − p for all k ∈ N (in this case we have L ∈ {0, +∞} with probabilities p and 1 − p, respectively, and the equality tanh(+∞) = 1 is exploited in (20)). Therefore (39) is particularized to…”
Section: A Proof of Theorem (mentioning)
Confidence: 99%
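The BEC evaluation quoted above can be worked out directly; under the same assumed soft-bit convention,

\[
g_k = \mathbb{E}\!\left[\tanh^{2k}\!\left(\tfrac{L}{2}\right)\right]
    = p\,\tanh^{2k}(0) + (1-p)\,\tanh^{2k}(+\infty)
    = 1 - p, \qquad k \in \mathbb{N},
\]

since an erasure gives $L = 0$ and an unerased observation gives $L = +\infty$, with $\tanh(0) = 0$ and $\tanh(+\infty) = 1$.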