Abstract: A formal Bayesian analysis of a mixture model usually leads to intractable calculations, since the posterior distribution takes into account all the partitions of the sample. We present approximation methods which evaluate the posterior distribution and Bayes estimators by Gibbs sampling, relying on the missing data structure of the mixture model. The data augmentation method is shown to converge geometrically, since a duality principle transfers properties from the discrete missing data chain to the parameter…
“…The data augmentation algorithm for Bayesian estimation of mixture models is studied by Diebolt and Robert (1994). In this case, the missing variables are the allocation variables Z i .…”
Section: Data Augmentation Algorithm For Ume
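To make the missing-data structure concrete, the sketch below implements a data augmentation Gibbs sampler for a two-component normal mixture with known component variances. The priors, variable names, and the known-variance simplification are illustrative assumptions of this sketch, not the exact specification of Diebolt and Robert (1994).

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_normal_mixture(y, n_iter=2000, sigma2=(529.0, 324.0)):
    """Data augmentation Gibbs sampler for a two-component normal mixture
    with known component variances (an illustrative simplification).

    Missing data: allocation variables z_i in {0, 1}.
    Assumed priors: p ~ Beta(1, 1); mu_j ~ N(m0, v0) with m0 = mean(y), v0 = 1e4.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    m0, v0 = y.mean(), 1e4
    p, mu = 0.5, np.array([y.min(), y.max()], dtype=float)
    draws = []
    for _ in range(n_iter):
        # Imputation step: draw the allocations z_i given y and the parameters.
        w1 = p * np.exp(-0.5 * (y - mu[1]) ** 2 / sigma2[1]) / np.sqrt(sigma2[1])
        w0 = (1 - p) * np.exp(-0.5 * (y - mu[0]) ** 2 / sigma2[0]) / np.sqrt(sigma2[0])
        z = rng.random(n) < w1 / (w0 + w1)
        # Posterior step: draw the parameters given y and z (conjugate updates).
        n1 = int(z.sum())
        p = rng.beta(1 + n1, 1 + n - n1)
        for j, idx in enumerate([~z, z]):
            nj = int(idx.sum())
            vj = 1.0 / (1.0 / v0 + nj / sigma2[j])
            mj = vj * (m0 / v0 + y[idx].sum() / sigma2[j])
            mu[j] = rng.normal(mj, np.sqrt(vj))
        draws.append((p, mu.copy()))
    return draws
```

Alternating these two steps is the duality referred to in the abstract: the discrete chain of allocations and the continuous chain of parameters each determine the behaviour of the other.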
“…This characteristic feature of the Bayes factor greatly complicates its routine application, as in general the diffuse prior densities have to be replaced by conjugate or other prior densities depending on hyperparameters; variations in the values of the hyperparameters may seriously affect the value of the Bayes factor (see Aitkin, 1991, and Diebolt and Robert, 1994, for other discussion). The posterior mean of the likelihood does not depend on the diffuse prior constant, and is insensitive to variations in the hyperparameters even if conjugate priors are used (Aitkin, 1991).…”
Section: A New Test Based On the Posterior Bayes Factor
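The posterior mean of the likelihood referred to above is straightforward to estimate from sampler output. The sketch below is a minimal illustration, assuming the log-likelihood has already been evaluated at each posterior draw; the function and argument names are assumptions of this sketch.

```python
import numpy as np

def log_posterior_mean_likelihood(loglik_draws):
    """Log of the posterior mean of the likelihood (the quantity behind the
    posterior Bayes factor of Aitkin, 1991), estimated from posterior draws.

    loglik_draws: log L(theta^(t); y) evaluated at each posterior draw, e.g.
    Gibbs sampler output. The average uses the log-sum-exp trick for stability.
    """
    lls = np.asarray(loglik_draws, dtype=float)
    m = lls.max()
    return m + np.log(np.mean(np.exp(lls - m)))

# The log posterior Bayes factor of model 2 against model 1 is then
# log_posterior_mean_likelihood(ll_2) - log_posterior_mean_likelihood(ll_1).
```

Because the likelihood is averaged over the posterior rather than the prior, the diffuse prior constant cancels, which is the insensitivity property described in the excerpt.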
“…For larger sample sizes other computational methods for the posterior Bayes factor would be required; methods based on the Gibbs sampler have been suggested by Diebolt and Robert (1994).…”
Section: Test Based On Contiguous Partitions
We present a new test for the presence of a normal mixture distribution, based on the posterior Bayes factor of Aitkin (1991). The new test has slightly lower power than the likelihood ratio test. It does not require the computation of the MLEs of the parameters or a search for multiple maxima, but requires computations based on 'classification likelihood' assignments of observations to mixture components.
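The 'classification likelihood' computation mentioned above amounts to a hard assignment of each observation to a mixture component. A minimal sketch for a normal mixture follows, assuming given weights, means and standard deviations; all names are illustrative rather than the paper's notation.

```python
import numpy as np
from scipy.stats import norm

def classification_loglik(y, weights, means, sds):
    """'Classification likelihood' for a normal mixture: each observation is
    assigned to the component under which it is most likely, and the
    log-likelihood is evaluated at those hard assignments.
    """
    y = np.asarray(y, dtype=float)
    # Component-wise log densities, shape (n_components, n_obs).
    logdens = np.array([np.log(w) + norm.logpdf(y, m, s)
                        for w, m, s in zip(weights, means, sds)])
    labels = logdens.argmax(axis=0)              # hard assignment of each y_i
    return logdens[labels, np.arange(len(y))].sum(), labels
```

A test statistic can then be built from such classification log-likelihoods without locating the MLEs, which is the computational simplification the abstract describes.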
“…First we simulated 50 observations from 0.36 × N(111, 529) + 0.64 × N(190, 324) as given in Diebolt and Robert (1991). As mentioned earlier we set ~j = 0.5, s² = 0.000002, ~,j = 3.0 for all j to have an approximate non-informative prior selection.…”
Section: Simulation For the Normal Mixtures
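For reference, a data set like the one described in the excerpt can be generated along the following lines; the random seed and the draw-by-draw sampling of component labels are assumptions of this sketch, and the second argument of N(·, ·) is read as a variance.

```python
import numpy as np

rng = np.random.default_rng(42)   # seed is an arbitrary choice for reproducibility

# 50 observations from the mixture 0.36 N(111, 529) + 0.64 N(190, 324),
# i.e. component standard deviations sqrt(529) = 23 and sqrt(324) = 18.
n = 50
labels = rng.random(n) < 0.64     # allocation to the second component
y = np.where(labels,
             rng.normal(190.0, np.sqrt(324.0), size=n),
             rng.normal(111.0, np.sqrt(529.0), size=n))
```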
“…Diebolt and Robert (1994) study sampling-based approaches to approximating the Bayes estimates for finite mixtures of normal distributions assuming the number of components k is known. Crawford (1991) proposes a modification of the Laplace method to estimate the Bayes estimators.…”
This paper describes a Bayesian approach to mixture modelling and a method based on the predictive distribution to determine the number of components in the mixture. The implementation is done through the use of the Gibbs sampler. The method is illustrated with mixtures of normal and gamma distributions. Analysis is presented for one simulated and one real data example. The Bayesian results are then compared with the likelihood approach for the two examples.
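One common way to turn Gibbs output into a predictive-distribution criterion for the number of components is through cross-validation predictive densities (CPO values) and the pseudo-marginal likelihood. The sketch below follows that generic recipe; it is not necessarily the exact criterion used in the paper, and the array names are assumptions.

```python
import numpy as np

def log_pseudo_marginal_likelihood(pointwise_lik):
    """Predictive criterion from Gibbs output.

    pointwise_lik: array of shape (n_draws, n_obs) holding f(y_i | theta^(t)),
    the density of each observation under each posterior draw. CPO_i is
    estimated by the harmonic mean of these values over draws; the criterion
    is the sum of log CPO_i, with larger values favouring the model.
    """
    pointwise_lik = np.asarray(pointwise_lik, dtype=float)
    cpo = 1.0 / np.mean(1.0 / pointwise_lik, axis=0)
    return np.log(cpo).sum()

# Usage (assumed names): lik_k2 and lik_k3 hold the pointwise densities from
# fitted 2- and 3-component mixtures; the larger criterion value indicates the
# preferred number of components.
```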