1993
DOI: 10.1109/26.237850

Relation of signal set choice to the performance of optimal non-Gaussian detectors


Cited by 31 publications (18 citation statements: 1 supporting, 17 mentioning, 0 contrasting)
References 10 publications
“…While the Chernoff bound [11] proves an upper bound on error probability with respect to KL divergence in hypothesis testing, the bound has been found to be loose [9]. Though not a provable bound (upper or lower) on error probability, KL does provide a close proxy; supporting arguments in addition to that given above can be found in [9], [10].…”
Section: B. Pairwise Error Using Kullback-Leibler Divergence (supporting)
confidence: 49%
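For context, the bound this excerpt refers to, in its standard textbook form (not quoted from the citing paper): for equal-prior binary hypothesis testing between densities $p_0$ and $p_1$,

$$
P_e \;\le\; \tfrac{1}{2}\,e^{-C(p_0,p_1)},
\qquad
C(p_0,p_1) \;=\; -\min_{0 \le s \le 1}\,\log \int p_0^{\,s}(x)\,p_1^{\,1-s}(x)\,dx,
$$

where the Chernoff information satisfies $C(p_0,p_1) \le \min\{D(p_0\|p_1),\, D(p_1\|p_0)\}$. Replacing $C$ with a KL divergence in the exponent therefore yields a quantity that is no longer a guaranteed upper bound, which is consistent with the excerpt's point that KL is a close proxy rather than a provable bound.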
“…This problem arises when trying to calculate the probability of error between two distributions parameterized by different target locations x_m. In lieu of a closed-form expression, Kullback-Leibler (KL) divergence has been used as a close proxy to pairwise error probability [7], [9], [10]. We motivate this relationship by returning to the two-target case of the ML decode rule (5) and rewriting it as…”
Section: B. Pairwise Error Using Kullback-Leibler Divergence (mentioning)
confidence: 99%
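As a concrete illustration of the proxy relationship this excerpt describes, here is a minimal sketch: it runs a two-hypothesis ML decode on Poisson-distributed counts and reports the Monte Carlo pairwise error alongside the closed-form Poisson KL divergence. The rates lam0 and lam1 are hypothetical, chosen only for illustration; the cited paper's model details are not reproduced here.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical two-target setup: each target location induces a Poisson
# rate on the observed count (lam0, lam1 are illustrative only, not
# values from the cited papers).
lam0, lam1 = 5.0, 9.0
rng = np.random.default_rng(0)

# Two-target ML decode: pick the hypothesis whose Poisson pmf gives the
# observed count the larger log-likelihood. Target 0 is the true target.
counts = rng.poisson(lam0, size=200_000)
decide_1 = poisson.logpmf(counts, lam1) > poisson.logpmf(counts, lam0)
p_err = decide_1.mean()  # Monte Carlo pairwise error probability

# Closed-form KL divergence between Poisson(lam0) and Poisson(lam1),
# the proxy quantity discussed in the excerpt.
kl = lam0 * np.log(lam0 / lam1) + lam1 - lam0

print(f"pairwise error ~ {p_err:.4f}   KL(P0 || P1) = {kl:.3f}")
```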
“…Ultimately, for a given γ, σ², number of signal points, and an average transmit power constraint, one might want to find the optimal signal constellation set that minimizes the SER. However, it is known that for non-Gaussian noise statistics, no analytical results for the optimal signal set exist [18], and numerical methods that are developed in solving such optimization problems are rather complicated [19]. Therefore, we only focus on four-point signal constellation optimization with some symmetry constraints such that the problem is more tractable, and the solution is more attractive for practical implementation purposes.…”
Section: Signal Constellation Optimization (mentioning)
confidence: 99%
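To make the kind of numerical search this excerpt alludes to concrete, here is a minimal sketch with several assumptions that are not taken from the cited work: Laplacian noise as the non-Gaussian example, a ±symmetric parameterization of the four-point set, unit average-power normalization, and a direct-search optimizer over a simulated SER.

```python
import numpy as np
from scipy.optimize import minimize

# Fixed noise realizations (common random numbers), so the simulated
# SER is a deterministic function of the constellation parameters.
rng = np.random.default_rng(1)
N = 20_000
tx = rng.integers(0, 4, N)                   # transmitted symbol indices
noise = rng.laplace(scale=0.4, size=(N, 2))  # noise scale is illustrative

def ser(params):
    # Four-point constellation with +/- symmetry: free points s1, s2 and
    # their negatives, normalized to unit average power. This particular
    # parameterization is an assumption for illustration.
    pts = np.array([params[:2], -params[:2], params[2:], -params[2:]])
    pts /= np.sqrt(np.mean(np.sum(pts ** 2, axis=1)))
    rx = pts[tx] + noise
    # ML detection in i.i.d. Laplacian noise = minimum L1 distance.
    d = np.abs(rx[:, None, :] - pts[None, :, :]).sum(axis=2)
    return np.mean(d.argmin(axis=1) != tx)

# Direct search over the free points; the simulated objective is only
# piecewise constant, so this is a rough sketch, not a polished method.
res = minimize(ser, x0=[1.0, 0.2, -0.2, 1.0], method="Nelder-Mead")
print("points (up to sign):", res.x.reshape(2, 2), "SER:", res.fun)
```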
“…Owing to the noisy Poisson model, particular spike counts will erroneously decode a target m′ when in fact the presented target was m. There is no closed-form expression for the probability of decode error between two Poisson noise distributions (Verdu 1986). Kullback-Leibler (KL) divergence is often used as a close proxy to pairwise error probability (Gockenbach and Kearsley 1999; Johnson and Orsak 1993; Johnson et al. 2001). KL divergence measures how different two probability distributions are.…”
Section: Optimal Target Placement Algorithm (mentioning)
confidence: 99%
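The closed-form Poisson KL divergence that makes this proxy convenient is a standard identity; writing $\lambda_m$ for the rate induced by target $m$:

$$
D\bigl(P_{\lambda_m}\,\|\,P_{\lambda_{m'}}\bigr)
= \mathbb{E}_{n \sim P_{\lambda_m}}\!\left[\, n\log\frac{\lambda_m}{\lambda_{m'}} - (\lambda_m - \lambda_{m'})\right]
= \lambda_m \log\frac{\lambda_m}{\lambda_{m'}} + \lambda_{m'} - \lambda_m ,
$$

since the log-ratio of two Poisson pmfs at count $n$ is $n\log(\lambda_m/\lambda_{m'}) - (\lambda_m - \lambda_{m'})$ and $\mathbb{E}[n] = \lambda_m$.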
“…The use of KL as a proxy to error probability is intuitively sound, and our simulations have shown that increasing KL divergence (making the two distributions more different) corresponds well to decreasing error probability. KL is commonly used when error probability is not closed form (Gockenbach and Kearsley 1999; Johnson and Orsak 1993; Johnson et al. 2001) with the understanding that making distributions more distinguishable (increasing the KL divergence) will generally reduce probability of error also. The relationship between KL and error probability can be motivated mathematically by returning to the two-target case of the ML decode rule (Eq.…”
Section: Optimal Target Placement Algorithm (mentioning)
confidence: 99%
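The monotone trend this excerpt describes can be checked directly in the Poisson case, since the two-hypothesis ML rule reduces to a threshold on the count, giving an exact pairwise error. The sketch below sweeps the second rate and tabulates KL divergence next to that exact error; all rates are illustrative only.

```python
import numpy as np
from scipy.stats import poisson

# For Poisson(lam0) vs Poisson(lam1) with lam1 > lam0, the ML rule is a
# threshold: decide lam1 iff n > (lam1 - lam0) / log(lam1 / lam0).
lam0 = 5.0
for lam1 in (6.0, 8.0, 12.0, 20.0):
    kl = lam0 * np.log(lam0 / lam1) + lam1 - lam0
    tau = np.floor((lam1 - lam0) / np.log(lam1 / lam0))
    # Equal-prior pairwise error: false alarm under lam0 plus miss under lam1.
    p_err = 0.5 * (poisson.sf(tau, lam0) + poisson.cdf(tau, lam1))
    print(f"lam1={lam1:5.1f}   KL={kl:6.3f}   P_err={p_err:.4f}")
```

Running this shows the error probability falling as the KL divergence grows, matching the intuition stated in the excerpt.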