2009
DOI: 10.1109/tit.2008.2009855

Detection of Gauss–Markov Random Fields With Nearest-Neighbor Dependency

Abstract: The problem of hypothesis testing against independence for a Gauss-Markov random field (GMRF) is analyzed. Assuming an acyclic dependency graph, an expression for the log-likelihood ratio of detection is derived. Under random placement of nodes over a large region according to the Poisson or uniform distribution and a nearest-neighbor dependency graph, the error exponent of the Neyman-Pearson detector is derived using large-deviations theory. The error exponent is expressed as a dependency-graph func…
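To make the setting concrete, here is a minimal sketch of the log-likelihood ratio for testing a chain-structured (hence acyclic) GMRF against independence. The chain length, the correlation model rho**|i-j|, and the dense-covariance evaluation are all illustrative assumptions; the paper derives a graph-factorized expression rather than this brute-force form.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative chain (hence acyclic) GMRF: unit-variance nodes with
# covariance rho**|i-j|, i.e., a first-order Gauss-Markov chain.
# n and rho are assumptions for this sketch, not values from the paper.
n, rho = 8, 0.4
Sigma1 = np.fromfunction(lambda i, j: rho ** np.abs(i - j), (n, n))
Sigma0 = np.eye(n)  # independence hypothesis: identity covariance

def log_likelihood_ratio(x):
    """log p1(x) - log p0(x); positive values favor the correlated GMRF."""
    return (multivariate_normal.logpdf(x, cov=Sigma1)
            - multivariate_normal.logpdf(x, cov=Sigma0))

rng = np.random.default_rng(0)
x1 = rng.multivariate_normal(np.zeros(n), Sigma1)  # drawn under the GMRF
x0 = rng.standard_normal(n)                        # drawn under independence
print(log_likelihood_ratio(x1), log_likelihood_ratio(x0))
```

Evaluating the LLR through the dependency graph instead of the full covariance matrix is precisely what the acyclic assumption enables; the dense form above is only for illustration.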

Cited by 46 publications (20 citation statements)
References 48 publications

Citation statements:
“…For differentiating between known signals in Gaussian noise, overall performance improves with sensor density, whereas for the detection of a Gaussian signal embedded in Gaussian noise, a finite sensor density is optimal [44]. Large-deviations results have also been applied to the problem of hypothesis testing against independence for a Gauss-Markov random field, where the error exponent of Neyman-Pearson hypothesis testing is analysed for different values of the variance ratio and correlation [46].…”
Section: (E) Correlated Observations (mentioning)
confidence: 99%
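For intuition about how such an exponent depends on the correlation, consider the classical scalar analogue of testing against independence (a deliberate simplification of the GMRF setting in [46]): a jointly Gaussian pair $(X, Y)$ with unit variances and correlation $\rho$ tested against the independent alternative. By Stein's lemma, the type-II error exponent equals the mutual information, which grows with $|\rho|$:

$$\mathcal{D} \;=\; D\!\left(P_{XY}\,\middle\|\,P_X P_Y\right) \;=\; I(X;Y) \;=\; -\tfrac{1}{2}\log\!\left(1-\rho^2\right).$$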
“…Under the Neyman-Pearson criterion, for a fixed type-I error bound, the exponent of the type-II error is independent of that bound [17] and is given by
$$\mathcal{D} \;\triangleq\; \lim_{n\to\infty} -\frac{1}{n}\log P_M, \tag{24}$$
where $P_M$ is the mis-detection (type-II) error probability. In the following theorem, we restate the closed form for the error exponent, derived in [2], in terms of the variables and functions defined here in (25)–(28), where $g(\cdot)$ is the correlation function.…”
Section: Detection Error Exponent (mentioning)
confidence: 99%
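The claim that the type-II exponent does not depend on the type-I bound can be checked numerically in a scalar testing-against-independence toy model (all parameters below are illustrative, not the paper's GMRF): the estimated decay rate $-\frac{1}{n}\log P_M$ is largely insensitive to the false-alarm level and, as $n$ grows, approaches the Stein exponent $I(X;Y) = -\frac{1}{2}\log(1-\rho^2)$.

```python
import numpy as np

# Scalar toy model (all parameters illustrative): n i.i.d. pairs, H0 is a
# correlated bivariate Gaussian, H1 is independence, as in classical
# testing against independence. Stein's lemma predicts exponent I(X;Y).
rng = np.random.default_rng(1)
rho, n, trials = 0.5, 40, 200_000
stein = -0.5 * np.log(1 - rho ** 2)  # I(X;Y) for the Gaussian pair

def llr(x, y):
    # Per-pair log [p_corr(x, y) / (p(x) p(y))], summed over the n pairs.
    return (-0.5 * np.log(1 - rho ** 2)
            - (x ** 2 - 2 * rho * x * y + y ** 2) / (2 * (1 - rho ** 2))
            + (x ** 2 + y ** 2) / 2).sum(axis=1)

x0 = rng.standard_normal((trials, n))           # data under H0 (correlated)
y0 = rho * x0 + np.sqrt(1 - rho ** 2) * rng.standard_normal((trials, n))
x1 = rng.standard_normal((trials, n))           # data under H1 (independent)
y1 = rng.standard_normal((trials, n))
t0, t1 = llr(x0, y0), llr(x1, y1)

for alpha in (0.1, 0.01):          # two very different type-I bounds
    tau = np.quantile(t0, alpha)   # reject H0 when the LLR falls below tau
    p_miss = (t1 >= tau).mean()    # type-II error: accepting H0 under H1
    print(f"alpha={alpha}: exponent ~ {-np.log(p_miss) / n:.3f} "
          f"(Stein: {stein:.3f})")
```

At moderate $n$ the agreement is only rough, as expected of a large-deviations limit, but the estimates for the two values of alpha are close to each other, which is the point being quoted.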
“…(25)–(28), where $g(\cdot)$ is the correlation function. Let $R$ denote the Rayleigh random variable with variance $\sigma^2$, and let $\bar{A}$ be the area of the union of two unit-radii circles with centers unit distance apart, given by
$$\bar{A} \;=\; \frac{4\pi}{3} + \frac{\sqrt{3}}{2}. \tag{29}$$
Theorem 3 (Expression for the error exponent): For a GMRF on the NNG with correlation function $g(\cdot)$, with the nodes drawn from the binomial point process… (Footnote 3: The expression given in [2] is in a different form, but reduces to (30).)…”
Section: Detection Error Exponent (mentioning)
confidence: 99%
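The constant in (29) is elementary to verify: the union of two unit-radius circles whose centers are unit distance apart has area $2\pi$ minus the lens-shaped intersection, which gives $4\pi/3 + \sqrt{3}/2 \approx 5.05$. A quick, purely illustrative numerical check:

```python
import numpy as np

# Closed form for the area quoted above: union of two unit circles whose
# centers are d = 1 apart is 2*pi minus the lens-shaped intersection.
d = 1.0
lens = 2 * np.arccos(d / 2) - (d / 2) * np.sqrt(4 - d ** 2)
print(2 * np.pi - lens)                # 5.0548...
print(4 * np.pi / 3 + np.sqrt(3) / 2)  # identical: 4*pi/3 + sqrt(3)/2

# Monte Carlo check over the bounding box [-1, 2] x [-1, 1] (area 6).
rng = np.random.default_rng(2)
pts = rng.uniform([-1, -1], [2, 1], size=(1_000_000, 2))
inside = (np.hypot(pts[:, 0], pts[:, 1]) <= 1) | \
         (np.hypot(pts[:, 0] - d, pts[:, 1]) <= 1)
print(6.0 * inside.mean())             # ~5.05
```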
“…Hence, we focus on the large-network scenario, where the number of observations goes to infinity. For any positive fixed level of false alarm (type-I error probability), when the mis-detection (type-II) error probability $P_M$ of the NP detector decays exponentially with the sample size $n$, the error exponent is defined by
$$\mathcal{D} \;\triangleq\; \lim_{n\to\infty} -\frac{1}{n}\log P_M. \tag{1}$$
The error exponent is an important performance measure, since a large exponent implies a faster decay of the error probability with increasing sample size.…”
mentioning
confidence: 99%