2011
DOI: 10.1016/j.ejor.2011.01.020
Maximising entropy on the nonparametric predictive inference model for multinomial data

Cited by 26 publications (31 citation statements)
References 29 publications
“…NPI for system reliability using the signature has also been presented, for systems consisting of only one type of components [4,5,15]. NPI has also been presented for a variety of other problems in operational research and statistics, including predictive analysis for queueing problems [17], replacement problems [24], decision making under uncertain utilities [30] and classification with decision trees using maximum entropy [1,2] (see also www.npi-statistics.com).…”
Section: Nonparametric Predictive Inference for System Failure Time
confidence: 99%
“…Case 1 only involved an initial assessment for rather trivial values of (l1, l2, l3) for which the system either functions or not with certainty. Without further calculations, the survival signature is only known to be in [0, 1] at all other (l1, l2, l3). Case 2 shows the effect of calculating Φ(0, 0, 3) = 1/2, and Case 3 that of the additional calculations Φ(0, 1, 0) = Φ(0, 1, 1) = 1/4; these are all pretty trivial to derive.…”
Section: Example
confidence: 99%
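
The excerpt above works with the survival signature Φ(l1, l2, l3): the probability that a system built from three component types functions, given that exactly lk components of type k are working. As a rough illustration of where such values come from, the sketch below computes Φ for a small hypothetical system by enumerating the component state vectors consistent with (l1, l2, l3). The component layout and structure function are assumptions chosen for illustration only; they do not reproduce the system of the cited example, whose details are not given in this report.

```python
from itertools import combinations

# Hypothetical layout: three component types.
# Type 1: components {0, 1}; type 2: component {2}; type 3: components {3, 4, 5}.
TYPES = {1: [0, 1], 2: [2], 3: [3, 4, 5]}

def structure(working: set) -> bool:
    """Assumed structure function: the system works if component 2 works,
    at least one type-1 component works, and at least two type-3 components work."""
    return (2 in working
            and len(working & {0, 1}) >= 1
            and len(working & {3, 4, 5}) >= 2)

def survival_signature(l1: int, l2: int, l3: int) -> float:
    """Phi(l1, l2, l3): probability that the system functions given that exactly
    l_k components of type k function (components exchangeable within a type).
    Computed as the fraction of state vectors consistent with (l1, l2, l3)
    for which the structure function equals 1."""
    counts = (l1, l2, l3)
    # All ways to choose which components of each type are the working ones.
    choices = [list(combinations(TYPES[k], counts[k - 1])) for k in (1, 2, 3)]
    total = len(choices[0]) * len(choices[1]) * len(choices[2])
    functioning = sum(
        structure(set(c1) | set(c2) | set(c3))
        for c1 in choices[0] for c2 in choices[1] for c3 in choices[2]
    )
    return functioning / total

if __name__ == "__main__":
    for l in [(0, 0, 3), (1, 1, 2), (2, 1, 3)]:
        print(l, survival_signature(*l))
```

As in the quoted case study, values of Φ at the "trivial" vectors (all components of a needed type failed, or everything working) follow immediately, while intermediate vectors require counting which state vectors leave the system functioning.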
“…For this aim, algorithms to attain the maximum entropy probability are required; these are presented in Abellán et al. [8]. We have used 40 data sets with the common characteristic that the class variable has a known number K ≥ 3 of cases or categories, as was considered in the model presented in Coolen and Augustin [17].…”
Section: Introduction
confidence: 99%
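
The excerpt above refers to algorithms for attaining the maximum entropy distribution within the set of probability distributions compatible with the NPI model for multinomial data (Abellán et al. [8]). As a simplified, hedged illustration of the general idea, the sketch below maximises entropy over an easier case where the feasible set is described only by per-category lower and upper bounds: the optimum is then the distribution closest to uniform inside the box, obtained by clamping a common level against the bounds and bisecting that level until the probabilities sum to one. This is not the NPI-M algorithm itself; the function name and example bounds are hypothetical.

```python
from typing import List, Tuple

def max_entropy_in_box(bounds: List[Tuple[float, float]],
                       tol: float = 1e-12) -> List[float]:
    """Maximum-entropy distribution subject only to per-category bounds
    l_i <= p_i <= u_i and sum(p_i) = 1.

    Entropy is maximised by the feasible distribution closest to uniform,
    so the optimum has the form p_i = min(max(c, l_i), u_i) for a common
    level c, found here by bisection.  Illustrative simplification only:
    the NPI model for multinomial data imposes richer, set-based constraints,
    handled by the dedicated algorithms of Abellán et al. cited above.
    """
    lows = [l for l, _ in bounds]
    highs = [u for _, u in bounds]
    if not (sum(lows) <= 1.0 + tol <= sum(highs) + 2 * tol):
        raise ValueError("bounds are infeasible: no distribution sums to 1")

    def mass(c: float) -> float:
        # Total probability mass when every p_i is the level c clamped to its bounds.
        return sum(min(max(c, l), u) for l, u in bounds)

    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mass(mid) < 1.0:
            lo = mid
        else:
            hi = mid
    c = 0.5 * (lo + hi)
    return [min(max(c, l), u) for l, u in bounds]

if __name__ == "__main__":
    # Hypothetical bounds for K = 4 categories.
    box = [(0.1, 0.4), (0.0, 0.2), (0.2, 0.6), (0.05, 0.5)]
    p = max_entropy_in_box(box)
    print(p, sum(p))
```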
“…Further extensions include [8], where NPI was considered for cases of two finitely exchangeable populations (where exchangeability is only assumed within populations and not between them), and [1], where algorithms were developed for obtaining maximum entropy for multinomial data. In an Operations Research context, NPI was previously applied to problems of decision modelling in multiple queues situations, e.g., [9], and to support replacement decisions, e.g., [12], where the fact that the NPI approach adapts fully to the available data is an attractive feature.…”
Section: Nonparametric Predictive Inference
confidence: 99%