2010
DOI: 10.1080/15598608.2010.10412000
Nonparametric Predictive Category Selection for Multinomial Data

Cited by 3 publications (12 citation statements), published 2012–2023
References 11 publications
“…For unordered categorical data, NPI has been developed (Coolen and Augustin 2009), and it has already been applied to category selection (Baker and Coolen 2010). There is an interesting opportunity to apply that method to problems involving diagnostic tests, particularly if a loss function can be identified that quantifies the consequences of wrong diagnoses.…”
Section: Discussion
confidence: 99%
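The loss-function idea in this excerpt can be made concrete. Below is a minimal Python sketch, not taken from the cited papers, of selecting the category (diagnosis) that minimizes the upper expected loss when only lower and upper probability bounds are available, as NPI provides; the bounds, the loss matrix, and the function names are illustrative assumptions.

```python
def upper_expected_loss(loss_row, lower, upper):
    # Maximize sum_j p_j * loss_row[j] over all distributions p with
    # lower[j] <= p[j] <= upper[j] and sum(p) == 1. This small linear
    # program has a greedy solution: start each p_j at its lower bound,
    # then spend the spare probability mass on the highest-loss
    # categories first.
    p = list(lower)
    budget = 1.0 - sum(p)
    for j in sorted(range(len(loss_row)), key=lambda j: -loss_row[j]):
        extra = min(upper[j] - p[j], budget)
        p[j] += extra
        budget -= extra
    return sum(pj * lj for pj, lj in zip(p, loss_row))

# Hypothetical example: three diagnoses with NPI-style probability
# bounds and a loss matrix loss[d][j] = cost of deciding d when the
# true category is j (all numbers are made up for illustration).
lower = [0.2, 0.1, 0.3]
upper = [0.5, 0.4, 0.6]
loss = [
    [0.0, 5.0, 2.0],
    [4.0, 0.0, 1.0],
    [3.0, 2.0, 0.0],
]
best = min(range(len(loss)),
           key=lambda d: upper_expected_loss(loss[d], lower, upper))
print("decision minimising upper expected loss:", best)
```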
“…Then, the majority vote is taken for classification. There are also several other works on classification trees from an imprecise probability perspective, based on Nonparametric Predictive Inference [3,14,35]. Abellán et al. [3] and Baker [14] show that applying the NPI-M to classification trees leads to slightly better classification accuracy than the IDM.…”
Section: Imprecise Probabilities
confidence: 99%
“…There are also several other works on classification trees from an imprecise probability perspective, based on Nonparametric Predictive Inference [3,14,35]. Abellán et al. [3] and Baker [14] show that applying the NPI-M to classification trees leads to slightly better classification accuracy than the IDM. In this thesis, we also introduce a new application of classification trees with imprecise probability, based on the Nonparametric Predictive Inference approach.…”
Section: Imprecise Probabilities
confidence: 99%
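For orientation, the IDM baseline mentioned in these excerpts attaches an interval, rather than a point estimate, to each class proportion at a tree node. The sketch below shows Walley's imprecise Dirichlet model interval with its usual hyperparameter s (commonly set to 1 or 2); the node counts are made up, and the NPI-M alternative compared in [3,14] would replace these intervals with NPI-based ones.

```python
def idm_interval(n_j, n_total, s=1.0):
    # Imprecise Dirichlet model bounds for a class proportion:
    # [n_j / (n_total + s), (n_j + s) / (n_total + s)] (Walley, 1996).
    return n_j / (n_total + s), (n_j + s) / (n_total + s)

# Example tree node with class counts 6, 3, 1 (illustrative numbers).
counts = [6, 3, 1]
n = sum(counts)
for j, n_j in enumerate(counts):
    lo, hi = idm_interval(n_j, n, s=2.0)
    print(f"class {j}: [{lo:.3f}, {hi:.3f}]")
```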
“…A(n) is not sufficient to derive precise probabilities for many events of interest, but optimal bounds for probabilities for all events of interest can be derived via the 'fundamental theorem of probability' [16]. These optimal bounds are lower and upper probabilities [1,2], and are applied in NPI to a range of statistical applications, where, through the use of latent variable representations, methods for Bernoulli and multinomial data have also been developed [3,6,7,9,10]. A generalization of A(n) in order to deal with right-censored observations has also been presented [12] and was e.g.…”
Section: Preliminaries
confidence: 99%
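To make the A(n)-based bounds in this excerpt concrete, here is a small sketch of NPI lower and upper probabilities for the next observation. The Bernoulli bounds follow Coolen (1998); the multinomial single-category bounds reflect one reading of the circular-A(n) (probability wheel) model in Coolen and Augustin (2009) and should be treated as an assumption rather than the papers' exact statement.

```python
def npi_bernoulli(successes, n):
    # NPI bounds for "next trial is a success" after observing
    # `successes` in n trials: [s/(n+1), (s+1)/(n+1)] (Coolen, 1998).
    return successes / (n + 1), (successes + 1) / (n + 1)

def npi_multinomial_category(n_j, n):
    # Assumed NPI bounds for "next observation is category c_j" when
    # c_j was observed n_j times out of n and other categories are also
    # present; based on the probability-wheel representation, clipped
    # to [0, 1]. Check against Coolen and Augustin (2009) before use.
    return max(0.0, (n_j - 1) / n), min(1.0, (n_j + 1) / n)

print(npi_bernoulli(7, 10))             # (0.636..., 0.727...)
print(npi_multinomial_category(4, 10))  # (0.3, 0.5)
```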