2015 | DOI: 10.4236/ojs.2015.56060
Bayesian Prediction of Future Generalized Order Statistics from a Class of Finite Mixture Distributions

Abstract: This article is concerned with the problem of prediction of future generalized order statistics from a mixture of two general components, based on a doubly type-II censored sample. We consider both one-sample and two-sample prediction techniques. Bayesian prediction intervals for the median of a future sample of generalized order statistics of odd and even sizes are obtained. Our results are specialized to ordinary order statistics and ordinary upper record values. A mixture of two Gompertz components…
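The paper derives its prediction intervals analytically for generalized order statistics under doubly type-II censoring; the snippet below is only a minimal Monte Carlo sketch of the two-sample prediction idea. It assumes hypothetical posterior draws for the mixing weight and the two Gompertz shape parameters (standing in for the posterior the censored-data analysis would supply) and approximates an equal-tailed prediction interval for the median of a future sample of odd size.

```python
# Minimal sketch (not the authors' derivation): approximate a two-sample
# Bayesian prediction interval for the median of a future sample from a
# two-component Gompertz mixture. All parameter draws below are hypothetical
# stand-ins for posterior samples.

import numpy as np
from scipy.stats import gompertz

rng = np.random.default_rng(0)

# Hypothetical posterior draws for (mixing weight, shape1, shape2).
n_post = 2000
p_draws = rng.beta(2, 2, size=n_post)        # mixing proportion
c1_draws = rng.gamma(2.0, 0.5, size=n_post)  # Gompertz shape, component 1
c2_draws = rng.gamma(3.0, 0.5, size=n_post)  # Gompertz shape, component 2

m = 11  # odd future-sample size, so the median is a single order statistic

def future_median(p, c1, c2, m, rng):
    """Simulate one future sample from the mixture and return its median."""
    comp = rng.random(m) < p
    x = np.where(comp,
                 gompertz.rvs(c1, size=m, random_state=rng),
                 gompertz.rvs(c2, size=m, random_state=rng))
    return np.median(x)

medians = np.array([future_median(p, c1, c2, m, rng)
                    for p, c1, c2 in zip(p_draws, c1_draws, c2_draws)])

# Equal-tailed 95% prediction interval for the future median.
lo, hi = np.quantile(medians, [0.025, 0.975])
print(f"approximate 95% prediction interval for the median of {m} future obs: "
      f"({lo:.3f}, {hi:.3f})")
```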

Cited by 3 publications (2 citation statements) | References 36 publications
“…Most current Bayesian mixtures are based either on the priors or on the statistical model, not both, as the new MBMA described in this paper does. For example, Abd and Al-Zaydi [37] [38] used statistical mixture models for order statistics; Al-Hussaini and Hussein [39] for exponential components; Ley and Steel [40] used a prior of mixtures with economic applications. Other Bayesian mixtures include Schäfer et al. [41] (spatial clustering), Yao [42] (Bayesian labeling), Sabourin and Naveau [43] (extremes), and Rodríguez and Walker [44] (kernel estimation).…”
Section: Discussion
Confidence: 99%
“…These include Markov chain Monte Carlo methods, non-iterative Monte Carlo methods, and asymptotic methods. Other Bayesian methods based on mixtures include Ley and Steel [40], Liang et al. [32], Schäfer et al. [41], Rodríguez and Walker [42], and Abd and Al-Zaydi [43]. Some frequentist mixtures include Abd and Al-Zaydi [44], and Al-Hussaini and Hussein [45].…”
Section: Posterior Model Selection Uncertainty
Confidence: 99%