2012
DOI: 10.1016/j.jmp.2012.05.003

The three-class ideal observer for univariate normal data: Decision variable and ROC surface properties

Abstract: Although a fully general extension of ROC analysis to classification tasks with more than two classes has yet to be developed, the potential benefits to be gained from a practical performance evaluation methodology for classification tasks with three classes have motivated a number of research groups to propose methods based on constrained or simplified observer or data models. Here we consider an ideal observer in a task with underlying data drawn from three univariate normal distributions. We investigate the…

Cited by 3 publications (5 citation statements) · References: 26 publications

“…This means LR₁ and LR₂ are each functions of the single variable x, and so we can express LR₂ as a relation of LR₁. Using the properties of normal functions, these relations can be “cataloged”, and the results used to analytically calculate operating points of the ideal observer given particular values of the decision criteria γ⃗ [27]. An example of such a likelihood ratio curve for a particular set of data parameters μ and σ², and decision criteria γ⃗, is shown in Fig 1.…”
Section: Theory (mentioning)
confidence: 99%
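
To make the quoted construction concrete, here is a minimal sketch of how the likelihood ratio curve can be traced numerically. It is not the authors' code: it assumes the convention, common in this literature, that LR₁ = p(x|π₁)/p(x|π₃) and LR₂ = p(x|π₂)/p(x|π₃), and the means and variances below are purely illustrative.

```python
# Sketch: trace the likelihood-ratio curve (LR1(x), LR2(x)) for three
# univariate normal classes.  All parameter values are hypothetical.
import numpy as np
from scipy.stats import norm

mu = np.array([0.0, 1.0, 2.0])      # class means mu_1, mu_2, mu_3 (illustrative)
sigma = np.array([1.0, 1.2, 1.5])   # class standard deviations    (illustrative)

def lr_pair(x):
    """Return (LR1, LR2) at x, taking class pi_3 as the reference class."""
    p1 = norm.pdf(x, mu[0], sigma[0])
    p2 = norm.pdf(x, mu[1], sigma[1])
    p3 = norm.pdf(x, mu[2], sigma[2])
    return p1 / p3, p2 / p3

# Because both likelihood ratios are functions of the single variable x,
# sweeping x traces a one-dimensional curve in the (LR1, LR2) plane.
x_grid = np.linspace(-5.0, 8.0, 2001)
lr1, lr2 = lr_pair(x_grid)
curve = np.column_stack([lr1, lr2])
print(curve[::500])                  # a few points along the curve
```

Sweeping a wide range of x only approximates the curve numerically; the analytic treatment referenced in the quotation characterizes it exactly.
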
“…Our most recently published work finally gave Charles Metz and me hope of breaking this vicious circle; in it, we fully and analytically characterized the ROC operating point behavior of a three-class ideal observer acting on univariate normal underlying data [27]. However, unlike the two-class task for which the sufficient statistic used by the ideal observer is always univariate, the three-class ideal observer makes use of a pair of decision variables.…”
Section: Introduction (mentioning)
confidence: 99%
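
The contrast drawn in this passage can be made concrete with a short sketch (my own illustration, not taken from either paper): a two-class rule compares one scalar likelihood ratio with one threshold, while a three-class rule acts on the pair (LR₁, LR₂). For simplicity the three-class rule below is the equal-utility, maximum-a-posteriori special case; the general three-class ideal observer instead partitions the (LR₁, LR₂) plane using its decision criteria. The priors and threshold are hypothetical.

```python
# Sketch: scalar decision variable (two-class) versus the pair of decision
# variables (three-class).  MAP special case; all numbers are hypothetical.
import numpy as np

priors = np.array([1/3, 1/3, 1/3])      # hypothetical class priors

def decide_two_class(lr1, threshold=1.0):
    """Two-class rule: a single scalar statistic against a single cutoff."""
    return 1 if lr1 >= threshold else 2

def decide_three_class(lr1, lr2):
    """Three-class MAP rule: the data enter only through the pair (LR1, LR2)."""
    scores = priors * np.array([lr1, lr2, 1.0])   # proportional to posteriors
    return int(np.argmax(scores)) + 1             # class label 1, 2, or 3

print(decide_two_class(0.8))         # -> 2
print(decide_three_class(0.8, 1.4))  # -> 2
```
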
“…7 and 8 to be re-expressed in a form more convenient for determining properties of the resulting likelihood ratio curve (maxima, inflection points, and so forth), and further can be shown to ensure that the likelihood ratio curve has finite support (i.e., that both LR₁ and LR₂ have finite maxima).⁷ In the bivariate case, this is not possible even in principle, because the covariances cannot be "ranked" in any useful way. (Consider a diagonal covariance matrix with one variance greater than one, and the other less than one; it cannot be said to be less than or greater than the identity matrix, the covariance matrix of class π₃.)…”
Section: Theory (mentioning)
confidence: 99%
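
The parenthetical example in the quotation is easy to verify numerically. The sketch below (my own illustration, with an arbitrarily chosen diagonal covariance) tests both directions of the Loewner, i.e. positive-semidefinite, ordering against the identity and finds that neither holds, which is the precise sense in which bivariate covariances cannot be "ranked".

```python
# Sketch: a diagonal covariance with one variance above 1 and one below 1 is
# neither >= nor <= the 2x2 identity in the positive-semidefinite ordering.
import numpy as np

sigma = np.diag([2.0, 0.5])     # hypothetical class covariance
identity = np.eye(2)            # covariance of class pi_3 in the quoted example

def loewner_geq(a, b, tol=1e-12):
    """True if a - b is positive semidefinite (a >= b in the Loewner order)."""
    return bool(np.all(np.linalg.eigvalsh(a - b) >= -tol))

print(loewner_geq(sigma, identity))   # False: Sigma - I has eigenvalue -0.5
print(loewner_geq(identity, sigma))   # False: I - Sigma has eigenvalue -1.0
# In one dimension the corresponding comparison of scalar variances always
# succeeds, which is what the univariate argument quoted above relies on.
```
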
“…⁷ Furthermore, we showed that the resulting ROC surface will have dimensionality no greater than four at any point, rather than the dimensionality of five needed for the ROC surface to be non-degenerate (required in order to calculate a suitable performance metric such as SAEC).⁸ Although surprising, these results were arguably of limited relevance to a wide variety of medical decision-making tasks, including medical imaging, in which the underlying data are multivariate rather than univariate. In the present work, we have investigated the behavior of the three-class ideal observer for underlying data which are drawn from three bivariate normal probability density functions (PDFs).…”
Section: Introduction (mentioning)
confidence: 99%