1998
DOI: 10.1080/10618600.1998.10474772

Estimating Mixture of Dirichlet Process Models

Cited by 369 publications (275 citation statements)
References 16 publications
“…With a finite number of clusters, this corresponds to fitting a finite mixture of distributions to the data (McLachlan and Peel, 2000), through likelihood-based methods such as expectation-maximization (Dempster et al, 1977) or Markov chain Monte Carlo (MCMC) methods in the Bayesian framework (Diebolt and Robert, 1994). The model can be generalized to a countably infinite number of clusters, by using a prior such as the Dirichlet process prior for mixture components that allows a few components to dominate (MacEachern and Müller, 1998). Such methods have become feasible with the development of powerful MCMC tools.…”
Section: Clustering Methods
confidence: 99%
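The "rich get richer" clustering that the excerpt attributes to the Dirichlet process prior can be illustrated through its Chinese restaurant process representation, in which each point joins an existing cluster with probability proportional to that cluster's size. The sketch below is a minimal illustration; the function name and parameter choices are mine, not from the cited papers.

```python
import random

def crp_draw(n, alpha, seed=0):
    """Sample cluster assignments for n points from the Chinese restaurant
    process with concentration alpha -- the clustering induced by a
    Dirichlet process prior after integrating out the random measure."""
    rng = random.Random(seed)
    counts = []        # counts[k] = current size of cluster k
    assignments = []
    for i in range(n):
        # point i joins existing cluster k with prob counts[k] / (i + alpha)
        # and opens a new cluster with prob alpha / (i + alpha)
        r = rng.uniform(0.0, i + alpha)
        acc = 0.0
        for k, c in enumerate(counts):
            acc += c
            if r < acc:
                counts[k] += 1
                assignments.append(k)
                break
        else:
            counts.append(1)
            assignments.append(len(counts) - 1)
    return assignments, counts

assignments, counts = crp_draw(1000, alpha=1.0)
# a handful of large clusters typically dominate, even with 1000 points
print(len(counts), sorted(counts, reverse=True)[:3])
```

With a small concentration parameter the number of occupied clusters grows only logarithmically in the number of points, which is the sense in which "a few components dominate."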
“…1 possible values and the other parameters can be updated using standard techniques for hierarchical models. Alternative efficient computational methods have been proposed for non-conjugate models (MacEachern and Müller 1998;Neal 2000).…”
Section: h(b)(1 − h(b)) M+1
confidence: 99%
“…Escobar and West (1995) provided a Gibbs sampling algorithm for the estimation of posterior distribution for all model parameters, MacEachern and Müller (1998) presented a Gibbs sampler with non-conjugate priors by using auxiliary parameters, and Neal (2000) provided an extended and more efficient Gibbs sampler to handle general Dirichlet process mixture models. Teh et al (2006) also extended the auxiliary variable method of Escobar and West (1995) for posterior sampling of the precision parameter with a gamma prior.…”
Section: Sampling Schemes for GLMDMs
confidence: 99%
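The auxiliary-parameter idea the excerpt credits to MacEachern and Müller (1998) and Neal (2000) can be sketched as follows: when updating a point's cluster assignment in a non-conjugate model, a few component parameters drawn fresh from the prior stand in for the intractable integral over new-component parameters. This is a minimal sketch in the style of Neal's Algorithm 8; the unit-variance Gaussian likelihood and the N(0, 3²) prior on component means are illustrative assumptions, not taken from the cited papers.

```python
import math
import random

def normal_pdf(x, mu, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def neal8_sweep(data, z, phi, alpha, m, rng):
    """One sweep of the cluster-assignment update in the style of Neal (2000),
    Algorithm 8.  z[i] is the cluster label of point i; phi[k] is the
    component parameter (here a Gaussian mean).  No conjugacy is required,
    since auxiliary parameters replace the marginal likelihood."""
    for i, x in enumerate(data):
        old = z[i]
        z[i] = None                          # detach point i from its cluster
        aux = {}
        if old not in z:                     # singleton cluster: recycle its
            aux[('aux', 0)] = phi.pop(old)   # parameter as an auxiliary
        while len(aux) < m:                  # fill the rest from the prior
            aux[('aux', len(aux))] = rng.gauss(0.0, 3.0)
        counts = {}
        for label in z:
            if label is not None:
                counts[label] = counts.get(label, 0) + 1
        labels, weights = [], []
        for k, n_k in counts.items():        # existing cluster: n_k * f(x | phi_k)
            labels.append(k)
            weights.append(n_k * normal_pdf(x, phi[k]))
        for k, theta in aux.items():         # auxiliary: (alpha / m) * f(x | theta)
            labels.append(k)
            weights.append((alpha / m) * normal_pdf(x, theta))
        pick = rng.choices(labels, weights=weights, k=1)[0]
        if isinstance(pick, tuple):          # an auxiliary won: open a new cluster
            new = max(phi.keys(), default=-1) + 1
            phi[new] = aux[pick]
            z[i] = new
        else:
            z[i] = pick
    return z, phi

rng = random.Random(1)
data = [-2.1, -1.9, -2.0, 2.0, 2.2, 1.8]
z, phi = [0] * len(data), {0: 0.0}           # start with everything in one cluster
for _ in range(25):
    z, phi = neal8_sweep(data, z, phi, alpha=1.0, m=3, rng=rng)
```

In a full sampler this assignment sweep would alternate with updates of each phi[k] given its cluster's data; the sketch shows only the step where the auxiliary-variable trick matters.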