2007
DOI: 10.1109/tit.2007.892807
Tightened Upper Bounds on the ML Decoding Error Probability of Binary Linear Block Codes

Abstract: The performance of maximum-likelihood (ML) decoded binary linear block codes is addressed via the derivation of tightened upper bounds on their decoding error probability. The upper bounds on the block and bit error probabilities are valid for any memoryless, binary-input and output-symmetric communication channel, and their effectiveness is exemplified for various ensembles of turbo-like codes over the AWGN channel. An expurgation of the distance spectrum of binary linear block codes further tightens the resul…

Cited by 23 publications (8 citation statements)
References 27 publications
“…As shown in [25,Appendix D], the average weight spectra of a random linear code [n, k] can be found as…”
Section: Numerical Results
confidence: 99%
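The quoted statement refers to the average weight spectrum of a random linear [n, k] code. As a hedged illustration (not the cited paper's derivation), in the standard random-parity-check ensemble each nonzero length-n word is a codeword with probability 2^(k-n), giving an expected weight-d multiplicity of C(n, d) * 2^(k-n):

```python
from math import comb

def avg_weight_spectrum(n: int, k: int) -> list[float]:
    """Expected number of codewords of each weight d = 0..n for the
    random linear [n, k] ensemble (illustrative sketch)."""
    spectrum = [1.0]  # the all-zero codeword is always in the code
    # Each of the C(n, d) nonzero words of weight d is a codeword
    # with probability 2^(k - n) in this ensemble.
    spectrum += [comb(n, d) * 2.0 ** (k - n) for d in range(1, n + 1)]
    return spectrum

# Toy example: an [8, 4] ensemble; the n and k values are assumptions.
spectrum = avg_weight_spectrum(n=8, k=4)
print(spectrum[2])  # 1.75, i.e. C(8, 2) * 2**(-4)
```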
“…One objective of this paper is, without too much complexity increase, to reduce the number of involved terms in the conventional union bound. The other objective of this paper is to tighten the bound on Pr{E d }, which used to be upper-bounded by the pairwise error probability, where intersections of half-spaces related to codewords other than the transmitted one are counted more than once. For some well-known existing improved bounds based on GFBT, such as the sphere bound (SB), the tangential-sphere bound (TSB) and the Divsalar bound, see the monograph [2, Ch.…”
Section: B. Union Bounds
confidence: 99%
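The conventional union bound mentioned in the quote sums pairwise error probabilities over the weight spectrum. A minimal sketch for the binary-input AWGN channel, assuming BPSK and using P_e <= sum_d A_d * Q(sqrt(2 d R Eb/N0)) (the spectrum and operating point below are illustrative, not the paper's setup):

```python
from math import comb, erfc, sqrt

def q_func(x: float) -> float:
    """Gaussian tail function Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / sqrt(2.0))

def union_bound(spectrum: dict[int, float], rate: float, ebno_linear: float) -> float:
    """Union bound on ML block error probability over the BI-AWGN channel:
    each weight-d term contributes A_d * Q(sqrt(2 * d * R * Eb/N0))."""
    return sum(a_d * q_func(sqrt(2.0 * d * rate * ebno_linear))
               for d, a_d in spectrum.items())

# Toy example: an assumed [8, 4] average spectrum at Eb/N0 = 4 dB.
toy_spectrum = {d: comb(8, d) * 2.0 ** (4 - 8) for d in range(1, 9)}
print(union_bound(toy_spectrum, rate=0.5, ebno_linear=10 ** (4 / 10)))
```

As the quote notes, this bound double-counts overlapping half-space error events, which is precisely what GFBT-based bounds such as the TSB tighten.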
“…Many tight ML upper bounds originate from a general bounding technique developed by Gallager [12]. Gallager's technique has been utilized extensively in the literature [5], [13], [14]. Similar forms of the general bound, displayed hereafter, have been previously presented in the literature [6], [15].…”
Section: A General ML Decoding Upper Bound
confidence: 98%
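For context, the general bounding technique the quote attributes to Gallager conditions on whether the received vector falls inside a "good" region of the observation space (a sketch of the standard form, with the region denoted here by an assumed symbol $\mathcal{R}$):

```latex
\Pr\{E\} \;\le\; \Pr\{E,\, \mathbf{y} \in \mathcal{R}\} \;+\; \Pr\{\mathbf{y} \notin \mathcal{R}\}
```

The first term is typically handled by a union bound restricted to $\mathcal{R}$, while the second is evaluated directly; optimizing the region yields bounds such as the tangential-sphere bound.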