1994
DOI: 10.1111/j.2517-6161.1994.tb01985.x

Estimation of Finite Mixture Distributions Through Bayesian Sampling

Abstract: A formal Bayesian analysis of a mixture model usually leads to intractable calculations, since the posterior distribution takes into account all the partitions of the sample. We present approximation methods which evaluate the posterior distribution and Bayes estimators by Gibbs sampling, relying on the missing data structure of the mixture model. The data augmentation method is shown to converge geometrically, since a duality principle transfers properties from the discrete missing data chain to the parameter…
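The two-step structure the abstract describes, imputing the missing component labels and then drawing the parameters from their completed-data posterior, can be sketched in a few lines. The following is a minimal illustration, not the paper's own algorithm or settings: it assumes a two-component normal mixture with known unit variances and illustrative Beta/normal conjugate priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 points from 0.4*N(-2, 1) + 0.6*N(3, 1).
n = 200
z_true = rng.random(n) < 0.4
y = np.where(z_true, rng.normal(-2.0, 1.0, n), rng.normal(3.0, 1.0, n))

# Illustrative conjugate priors (assumptions, not the paper's choices):
# p ~ Beta(1, 1), mu_j ~ N(0, 100); component variances fixed at 1.
sigma2, prior_var = 1.0, 100.0
p, mu = 0.5, np.array([-1.0, 1.0])

for sweep in range(2000):
    # Step 1 (imputation): draw the missing allocations Z_i given
    # the current parameters.
    w0 = p * np.exp(-0.5 * (y - mu[0]) ** 2 / sigma2)
    w1 = (1.0 - p) * np.exp(-0.5 * (y - mu[1]) ** 2 / sigma2)
    z = rng.random(n) < w1 / (w0 + w1)       # True -> component 1

    # Step 2 (posterior): draw the parameters given the completed data.
    n1 = int(z.sum())
    p = rng.beta(1 + (n - n1), 1 + n1)       # weight of component 0
    for j, idx in enumerate([~z, z]):
        post_var = 1.0 / (idx.sum() / sigma2 + 1.0 / prior_var)
        post_mean = post_var * y[idx].sum() / sigma2
        mu[j] = rng.normal(post_mean, np.sqrt(post_var))

print("final draw of (p, mu):", p, mu)
```

The alternation between the discrete allocation chain (Step 1) and the parameter chain (Step 2) is exactly the duality the abstract exploits: geometric convergence of the finite-state allocation chain carries over to the parameter chain.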

Cited by 660 publications (482 citation statements)
References 14 publications

Citation statements, ordered by relevance:
“…The data augmentation algorithm for Bayesian estimation of mixture models is studied by Diebolt and Robert (1994). In this case, the missing variables are the allocation variables Z i .…”
Section: Data Augmentation Algorithm For Ume (mentioning)
confidence: 99%
“…This characteristic feature of the Bayes factor greatly complicates its routine application, as in general the diffuse prior densities have to be replaced by conjugate or other prior densities depending on hyperparameters; variations in the values of the hyperparameters may seriously affect the value of the Bayes factor (see Aitkin, 1991, and Diebolt and Robert, 1994, for further discussion). The posterior mean of the likelihood does not depend on the diffuse prior constant, and is insensitive to variations in the hyperparameters even if conjugate priors are used (Aitkin, 1991).…”
Section: A New Test Based On The Posterior Bayes Factor (mentioning)
confidence: 99%
“…For larger sample sizes other computational methods for the posterior Bayes factor would be required; methods based on the Gibbs sampler have been suggested by Diebolt and Robert (1994).…”
Section: Test Based On Contiguous Partitions (mentioning)
confidence: 99%
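Both of these citances point at the same computation: the posterior Bayes factor replaces the prior mean of the likelihood with its posterior mean, which can be approximated by averaging the likelihood over Gibbs output. A minimal sketch under the same illustrative two-component normal setup as above; the function names and the `draws` format are assumptions of this sketch, not the cited papers' code.

```python
import numpy as np
from scipy.stats import norm

def mixture_likelihood(y, p, mu0, mu1, sigma=1.0):
    # Likelihood of the sample under a two-component normal mixture.
    dens = p * norm.pdf(y, mu0, sigma) + (1.0 - p) * norm.pdf(y, mu1, sigma)
    return np.prod(dens)

def posterior_mean_likelihood(y, draws):
    # Monte Carlo estimate of E[L(theta) | y]: average the likelihood
    # over posterior draws (p, mu0, mu1), e.g. the retained sweeps of
    # the Gibbs sampler sketched earlier.
    return np.mean([mixture_likelihood(y, p, m0, m1) for p, m0, m1 in draws])
```

For larger samples the product of densities underflows in floating point, so in practice one would average on the log scale (a log-sum-exp of the log likelihoods minus the log of the number of draws), which is consistent with the citance's remark that larger sample sizes call for other computational methods.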
“…First we simulated 50 observations from 0.36 × N(111, 529) + 0.64 × N(190, 324) as given in Diebolt and Robert (1991). As mentioned earlier we set τ_j = 0.5, s² = 0.000002 and ν_j = 3.0 for all j to obtain an approximately non-informative prior specification.…”
Section: Simulation For The Normal Mixtures (mentioning)
confidence: 99%
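The simulation design quoted above is straightforward to reproduce. Assuming the conventional reading that N(m, v) is parameterized by its variance v, the component standard deviations are sqrt(529) = 23 and sqrt(324) = 18:

```python
import numpy as np

rng = np.random.default_rng(1)

# 50 observations from 0.36 * N(111, 529) + 0.64 * N(190, 324),
# with N(m, v) parameterized by variance v.
n = 50
first = rng.random(n) < 0.36                # True -> component N(111, 529)
y = np.where(first,
             rng.normal(111.0, 23.0, n),    # sd = sqrt(529) = 23
             rng.normal(190.0, 18.0, n))    # sd = sqrt(324) = 18
```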
“…Diebolt and Robert (1994) study sampling-based approaches to approximating the Bayes estimates for finite mixtures of normal distributions, assuming the number of components k is known. Crawford (1991) proposes a modification of the Laplace method to estimate the Bayes estimators.…”
Section: Introduction (mentioning)
confidence: 99%