2005
DOI: 10.1007/11508069_27

A Dynamic Merge-or-Split Learning Algorithm on Gaussian Mixture for Automated Model Selection

Cited by 9 publications (6 citation statements)
References 10 publications
“…There are several widely used classes of methods for inferring the number of clusters or Gaussian components, such as measure-based [3][4][5][6], Bayesian [7][8], and population-based [9][10], to name a few. The problem of inferring the number of components is a special case of the model selection problem, and the measure-based class of methods is the most widely used.…”
Section: Introduction (mentioning)
confidence: 99%
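The quote above singles out measure-based methods as the most widely used class for choosing the number of components. Below is a minimal sketch of that idea in Python, assuming scikit-learn's GaussianMixture and using BIC as the measure; the data, candidate range, and seeds are illustrative and not taken from the cited works.

import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative data: two well-separated Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (200, 2)),
               rng.normal(5.0, 1.0, (200, 2))])

# Measure-based selection: fit each candidate model and keep the
# one with the lowest BIC (one common choice of measure).
candidates = range(1, 7)
models = [GaussianMixture(n_components=k, random_state=0).fit(X)
          for k in candidates]
best_k = min(candidates, key=lambda k: models[k - 1].bic(X))
print("selected number of components:", best_k)

Swapping .bic for .aic gives another common measure; the works cited in the quote cover a range of such criteria.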
“…In [3] the authors developed a new methodology for fully Bayesian mixture analysis, using reversible jump Markov chain Monte Carlo methods to jump between the parameter subspaces corresponding to different numbers of components in the mixture, while [4] proposes a new kind of dynamic merge-or-split learning (DMOSL) algorithm to deal with the selection of the number of Gaussians in the mixture. [5] describes an EM algorithm for nonparametric maximum likelihood (ML) estimation with a variance component structure, [6] introduces a greedy algorithm for learning a Gaussian mixture model using a combination of global and local search, [7] introduced a split-and-merge operation in order to alleviate the problem of local convergence of the usual EM algorithm, and in [8] the authors use a split-and-merge algorithm to analyze tracked video data.…”
Section: Introduction (mentioning)
confidence: 99%
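The quote contrasts split-and-merge EM variants with the dynamic merge-or-split (DMOSL) idea of the indexed paper. The following is a hedged sketch of one common merge criterion, the correlation of posterior responsibilities used in split-and-merge EM approaches; it is an illustration rather than the exact DMOSL criterion, and merge_candidates is a hypothetical helper name.

import numpy as np
from sklearn.mixture import GaussianMixture

def merge_candidates(gmm, X):
    # Rank component pairs by the correlation of their posterior
    # responsibilities; pairs with a high score explain largely the
    # same data points and are natural merge candidates.
    R = gmm.predict_proba(X)                    # responsibilities, shape (n_samples, k)
    k = R.shape[1]
    scores = {}
    for i in range(k):
        for j in range(i + 1, k):
            scores[(i, j)] = R[:, i] @ R[:, j]  # merge score for the pair (i, j)
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Illustrative use: deliberately over-fit with too many components,
# then inspect which pair the criterion would merge first.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (300, 2)),
               rng.normal(6.0, 1.0, (300, 2))])
gmm = GaussianMixture(n_components=4, random_state=0).fit(X)
print(merge_candidates(gmm, X)[0])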
“…There are many resources on Gaussian models and fast learning algorithms: the authors of [1] develop iterative learning algorithms for estimating a coefficient vector in a given setting of statistical models with unknown hyperparameters, [2] describes a globally supervised algorithm for Gaussian mixture models based on maximum relative entropy (MRE), [3] presents an iterative algorithm for entropy regularized likelihood (ERL) learning on Gaussian mixtures, and [4] proposed a new kind of dynamic merge-or-split learning (DMOSL) algorithm on Gaussian mixtures such that the number of Gaussians can be determined automatically. [5] presents a fast approximation method, based on kd-trees, that reduces both the prediction and training times of Gaussian process regression, and [6] proposed a fast fixed-point learning algorithm for efficiently implementing maximization of the harmony function on Gaussian mixtures with automated model selection.…”
Section: Introduction (mentioning)
confidence: 99%
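The last reference in the quote concerns maximizing a harmony function on a Gaussian mixture with automated model selection. The sketch below assumes the commonly cited form H = (1/N) sum_t sum_j p(j|x_t) log(alpha_j N(x_t | m_j, S_j)) and evaluates it for a scikit-learn fit; the helper name and data are illustrative, and this is not the fixed-point algorithm of the cited work.

import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def harmony(gmm, X):
    # Evaluate H = (1/N) * sum_t sum_j p(j|x_t) * log(alpha_j * N(x_t | m_j, S_j)),
    # the criterion whose maximization is reported to drive the mixing
    # weights of redundant components toward zero.
    log_joint = np.column_stack([
        np.log(gmm.weights_[j])
        + multivariate_normal.logpdf(X, gmm.means_[j], gmm.covariances_[j])
        for j in range(gmm.n_components)
    ])                                  # log(alpha_j * N(x_t|...)), shape (N, k)
    post = gmm.predict_proba(X)         # posteriors p(j|x_t)
    return float(np.mean(np.sum(post * log_joint, axis=1)))

# Illustrative comparison: harmony value for a well-matched model
# versus an over-parameterized one on the same data.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 1.0, (300, 2)),
               rng.normal(5.0, 1.0, (300, 2))])
for k in (2, 6):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    print(k, harmony(gmm, X))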