Statistical Problems in Particle Physics, Astrophysics and Cosmology 2006
DOI: 10.1142/9781860948985_0021

Limits and Confidence Intervals in the Presence of Nuisance Parameters

Abstract: We study the frequentist properties of confidence intervals computed by the method known to statisticians as the Profile Likelihood. It is seen that the coverage of these intervals is surprisingly good over a wide range of possible parameter values for important classes of problems, in particular whenever there are additional nuisance parameters with statistical or systematic errors. Programs are available for calculating these intervals.
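The method the abstract refers to, profile-likelihood intervals for a counting experiment with a background nuisance parameter, can be sketched as follows. This is a minimal illustration assuming the standard on/off model n ~ Poisson(mu + b) on-source and m ~ Poisson(tau * b) off-source; it is not the authors' released program.

```python
import math

def profile_b(mu, n, m, tau):
    # Background MLE b_hat(mu) for fixed signal mu, from d(lnL)/db = 0
    # (positive root of the resulting quadratic equation).
    c = 1.0 + tau
    B = c * mu - n - m
    return (-B + math.sqrt(B * B + 4.0 * c * m * mu)) / (2.0 * c)

def nll(mu, n, m, tau):
    # Profiled negative log-likelihood (additive constants dropped) for
    # n ~ Poisson(mu + b) on-source, m ~ Poisson(tau * b) off-source.
    b = profile_b(mu, n, m, tau)
    out = (mu + b) + tau * b
    if n > 0:
        out -= n * math.log(mu + b)
    if m > 0:
        out -= m * math.log(tau * b)
    return out

def profile_interval(n, m, tau, delta_crit=2.706):
    # Approximate 90% CL interval: all mu with
    # 2 * (nll(mu) - nll(mu_hat)) <= delta_crit,
    # invoking Wilks' theorem for -2 ln(profile likelihood ratio).
    mu_hat = max(0.0, n - m / tau)
    nll_hat = nll(mu_hat, n, m, tau)
    dl = lambda mu: 2.0 * (nll(mu, n, m, tau) - nll_hat)

    hi = mu_hat + 1.0                 # bracket the upper crossing
    while dl(hi) < delta_crit:
        hi *= 2.0
    lo = mu_hat
    for _ in range(100):              # bisect the rising branch
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if dl(mid) < delta_crit else (lo, mid)
    upper = 0.5 * (lo + hi)

    if mu_hat == 0.0 or dl(0.0) < delta_crit:
        lower = 0.0                   # clipped at the physical boundary mu = 0
    else:
        lo, hi = 0.0, mu_hat          # bisect the falling branch
        for _ in range(100):
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if dl(mid) > delta_crit else (lo, mid)
        lower = 0.5 * (lo + hi)
    return lower, upper
```

For example, with n = 10 on-source, m = 5 off-source and tau = 1, the point estimate is mu_hat = n - m/tau = 5, and the lower edge of the 90% interval is clipped at the physical boundary zero.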

Cited by 55 publications (72 citation statements)
References 14 publications
“…A_eff(E′; ΔE) is the effective area for events with measured energy within ΔE, as a function of true energy E′; N^UL_ex is computed using global N_ON, N_OFF and τ values and the conventional method [37], for a 95% CL and assuming a systematic uncertainty on the overall detection efficiency of 30%. A_eff(E′; ΔE) is computed for the entire sample as the weighted average of the effective areas of the four considered data sets, with weights being the corresponding observation times, i.e.…”
Section: Discussion
confidence: 99%
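The observation-time-weighted average of effective areas described in this statement amounts to a one-line computation; a minimal sketch (function name and sample numbers are illustrative, not from the cited analysis):

```python
def weighted_effective_area(areas, obs_times):
    """Average effective area over data sets, weighted by observation time.

    areas:     per-data-set effective areas A_eff(E'; dE) at a given true energy
    obs_times: corresponding observation times (the weights)
    """
    total_time = sum(obs_times)
    return sum(a * t for a, t in zip(areas, obs_times)) / total_time
```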
“…We note that the results obtained this way are conservative (i.e. they may have a slight over-coverage, see [37]), since negative fluctuations cannot produce artificially constraining limits.…”
Section: Full Likelihood Analysis
confidence: 97%
“…While there are possible ways to incorporate systematic uncertainties without leaving the frequentist framework (see, for example, [104]), there are some potential drawbacks with these solutions. After comparing the characteristics of a few options (see Section 8.3.3), we decided to use a semi-Bayesian technique to incorporate systematic errors.…”
Section: Systematic Error Incorporation
confidence: 99%
“…To compare this Bayesian method with the frequentist approach, we have also determined the 90% upper limit for the source rate using the Feldman-Cousins [9] and Rolke et al. [10] methods. We see that the Bayesian approach gives the same result as the Rolke et al. method and is more conservative than the Feldman-Cousins method. To further test the upper limit calculation, we have generated source signals (or under-fluctuations) by increasing (or decreasing) the number of events in the on-source region, while keeping the same number of background events as in the data.…”
Section: Determination of the Source Rate
confidence: 99%
“…In the light of a flux upper-limit determination, we will discuss a method following the Bayesian framework, and a comparison will be made to the standard frequentist methods developed by Feldman and Cousins [9] and Rolke et al. [10]. The Feldman-Cousins method introduces the likelihood ratio as an ordering principle when determining the acceptance interval from which one derives the confidence interval.…”
Section: Introduction
confidence: 99%
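The likelihood-ratio ordering mentioned in this last statement can be sketched for the simplest case, a Poisson signal mu on top of a known mean background b. This is a toy grid-scan illustration of the Feldman-Cousins construction, not code from any of the cited papers.

```python
import math

def pois(n, lam):
    # Poisson pmf, computed via logs to stay stable for large n.
    if lam <= 0.0:
        return 1.0 if n == 0 else 0.0
    return math.exp(n * math.log(lam) - lam - math.lgamma(n + 1))

def fc_interval(n_obs, b, cl=0.90, mu_max=15.0, step=0.01, n_max=50):
    # Feldman-Cousins interval for a Poisson mean mu with known background b.
    # For each mu on a grid, counts n are accepted in decreasing order of the
    # ratio R(n) = P(n | mu + b) / P(n | mu_best + b), mu_best = max(0, n - b),
    # until the acceptance region holds probability >= cl; the confidence
    # interval is the set of mu whose acceptance region contains n_obs.
    accepted_mus = []
    for k in range(int(round(mu_max / step)) + 1):
        mu = k * step
        probs = [pois(n, mu + b) for n in range(n_max)]
        rank = [probs[n] / pois(n, max(0.0, n - b) + b) for n in range(n_max)]
        order = sorted(range(n_max), key=lambda n: rank[n], reverse=True)
        total, accepted = 0.0, set()
        for n in order:
            accepted.add(n)
            total += probs[n]
            if total >= cl:
                break
        if n_obs in accepted:
            accepted_mus.append(mu)
    return min(accepted_mus), max(accepted_mus)
```

For n_obs = 0 with b = 3.0, this reproduces the characteristic Feldman-Cousins behaviour: the lower edge is zero, and an observed downward fluctuation of the background does not drive the 90% upper limit to zero but leaves it of order one.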