2002
DOI: 10.1086/338316

A New Source Detection Algorithm Using the False-Discovery Rate

Abstract: The False Discovery Rate (FDR) method has recently been described by Miller et al. (2001), along with several examples of astrophysical applications. FDR is a new statistical procedure due to Benjamini & Hochberg (1995) for controlling the fraction of false positives when performing multiple hypothesis testing. The importance of this method to source detection algorithms is immediately clear. To explore the possibilities offered we have developed a new task for performing source detection in radio-telescope im…

Cited by 123 publications (131 citation statements)
References 7 publications
“…This technique was described by Miller et al. (2001); Hopkins et al. (2002); Starck et al. (2006b); Pires et al. (2006) for several astrophysical applications. Instead of controlling the chance of any false positives, FDR controls the expected proportion of false positives.…”
Section: Appendix A: The FDR Method (mentioning, confidence: 99%)
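The excerpt above contrasts FDR with family-wise error control: rather than bounding the chance of *any* false positive, FDR bounds the *expected fraction* of false positives among the detections. A minimal sketch of the Benjamini & Hochberg (1995) step-up procedure follows; the function name and the `alpha` parameter are illustrative, not part of any cited package.

```python
import numpy as np

def benjamini_hochberg(p_values, alpha=0.05):
    """Benjamini-Hochberg step-up procedure (illustrative sketch).

    Returns a boolean mask marking which hypotheses are rejected while
    controlling the expected false-discovery rate at level alpha.
    """
    p = np.asarray(p_values, dtype=float)
    n = len(p)
    order = np.argsort(p)          # indices that sort p ascending
    sorted_p = p[order]
    # Find the largest k such that p_(k) <= (k / n) * alpha
    thresholds = alpha * np.arange(1, n + 1) / n
    below = sorted_p <= thresholds
    reject = np.zeros(n, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])
        reject[order[: k + 1]] = True   # reject all hypotheses up to rank k
    return reject
```

Note the step-up character: once the largest qualifying rank k is found, *all* p-values below it are rejected, even those that individually miss their own threshold.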
“…Both packages allow calculation of rms and mean images and identification of sources in radio maps through the use either of a false discovery rate (FDR) method (Hopkins et al. 2002), or of a threshold technique that locates islands of emission above some multiple of the noise in the image. Gaussians are then fitted to each island for accurate measurement of source properties.…”
Section: Source Finding (mentioning, confidence: 99%)
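The alternative the excerpt describes, finding "islands" of emission above some multiple of the noise, can be sketched in a few lines. The function name `find_islands`, the `nsigma` parameter, and the use of a single global rms are illustrative simplifications: real pipelines estimate the rms locally (or via FDR thresholding) and then fit Gaussians to each island.

```python
import numpy as np
from scipy import ndimage

def find_islands(image, nsigma=5.0):
    """Label islands of emission above nsigma * rms (illustrative sketch).

    Uses a single global rms estimate; production source finders use a
    local rms map or an FDR-derived threshold instead.
    """
    rms = np.std(image)                 # crude global noise estimate
    mask = image > nsigma * rms         # pixels above the threshold
    labels, n_sources = ndimage.label(mask)  # connected-component islands
    return labels, n_sources
```

Each labelled island would then be passed to a Gaussian-fitting step to measure position, flux, and shape.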
“…The result of this analysis shows clearly that it will be crucial to first develop and then implement an automated algorithm for establishing the appropriate scale size on which to estimate the 'local' background and rms noise values for each image, and that this may not be common to all images. This analysis also suggests that the false discovery rate approach (Hopkins et al. 2002) to thresholding may be the most robust, optimising for both completeness and reliability.…”
Section: Compact Source Extraction (mentioning, confidence: 91%)