2000
DOI: 10.1016/s0167-8655(00)00031-3

Learning mixture models using a genetic version of the EM algorithm

Cited by 36 publications (33 citation statements)
References 11 publications
“…Exceptions where both mean vectors and full covariance matrices were used include [4,5] where EM was used for the actual local optimization by fitting Gaussians to data in each iteration and GA was used only to guide the global search by selecting individual Gaussian components from existing candidate solutions in the reproduction steps. However, treating each Gaussian component as a whole in the search process and fitting it locally using the EM iterations may not explore the whole solution space effectively especially in higher dimensions.…”
Section: Related Work
confidence: 99%
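The division of labour this quote describes, with EM doing the local fitting of full-covariance Gaussians while the GA steers the global search, can be illustrated by the EM iteration that would refine each candidate mixture between reproduction steps. This is a minimal sketch under my own naming (`em_step` is not from the cited works):

```python
import numpy as np

def em_step(X, weights, means, covs):
    """One EM iteration for a Gaussian mixture with full covariance
    matrices -- the local-optimization role assigned to EM in [4,5]."""
    n, d = X.shape
    k = len(weights)
    # E-step: unnormalised joint densities weights[j] * N(x_i; means[j], covs[j])
    r = np.empty((n, k))
    for j in range(k):
        diff = X - means[j]
        inv = np.linalg.inv(covs[j])
        norm = 1.0 / np.sqrt(((2 * np.pi) ** d) * np.linalg.det(covs[j]))
        r[:, j] = weights[j] * norm * np.exp(-0.5 * np.sum(diff @ inv * diff, axis=1))
    loglik = np.log(r.sum(axis=1)).sum()          # data log-likelihood
    r /= r.sum(axis=1, keepdims=True)             # responsibilities
    # M-step: re-estimate weights, means, and full covariances
    nk = r.sum(axis=0)
    new_weights = nk / n
    new_means = (r.T @ X) / nk[:, None]
    new_covs = []
    for j in range(k):
        diff = X - new_means[j]
        new_covs.append((r[:, j, None] * diff).T @ diff / nk[j]
                        + 1e-6 * np.eye(d))       # small ridge for stability
    return new_weights, new_means, new_covs, loglik
```

In the hybrid scheme the quote criticises, each candidate solution would run a few such iterations per generation, so every Gaussian component is only ever adjusted locally around its current position.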
“…These approaches enable multiple candidate solutions to simultaneously converge to possibly different optima by making use of the interactions. Genetic algorithm (GA) [3-7], differential evolution (DE) [8], and particle swarm optimization (PSO) [9-12] have been the most common population-based stochastic search algorithms used for the estimation of some form of GMMs. Even though these approaches have been shown to perform better than non-stochastic alternatives such as k-means and fuzzy c-means, the interaction mechanism that forms the basis of the power of the stochastic search algorithms has also limited the use of these methods due to some inherent assumptions in the candidate solution parametrization.…”
Section: Introduction
confidence: 99%
“…We use a kind of genetic EM (GA-EM) (Martínez and Vitrià 2000; Pernkopf and Bouchaffra 2005) combining EM with a Genetic Algorithm (GA) for estimating the models. A GA works by simulating evolution in natural systems, by initialization of a population of individuals and then iteratively performing selection of the fittest individuals and reproduction until a termination criterion is met.…”
Section: Mixture Model Estimation
confidence: 99%
“…GA-EM estimation therefore attempts to fix the convergence problem present in EM estimation. In contrast to earlier GA-EM algorithms (Martínez and Vitrià 2000; Pernkopf and Bouchaffra 2005), we perform the GA-EM using a single mixture model and treat the mixture model components as the individuals for GA.…”
Section: Mixture Model Estimation
confidence: 99%
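The GA recipe summarised in these quotes (initialise a population of candidate mixtures, select the fittest, reproduce until a termination criterion is met) might be sketched as follows for 1-D Gaussians. Everything here is my own simplification, not code from the cited works: the fixed generation budget stands in for a real termination criterion, reproduction exchanges whole components between parents as in the earlier GA-EM style, and the interleaved EM refinement is omitted for brevity.

```python
import numpy as np

def loglik(X, cand):
    """Log-likelihood of 1-D data under a candidate mixture
    (weights w, means mu, variances var) -- the GA fitness."""
    w, mu, var = cand
    dens = w * np.exp(-0.5 * (X[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens.sum(axis=1)).sum()

def ga_estimate(X, k=2, pop_size=12, gens=20, rng=None):
    """Generic GA loop: initialise a population of candidate mixtures,
    keep the fittest half, and refill by component-level crossover."""
    if rng is None:
        rng = np.random.default_rng(0)

    def random_candidate():
        w = np.full(k, 1.0 / k)
        mu = rng.choice(X, size=k)        # means seeded from data points
        var = np.full(k, X.var())
        return (w, mu, var)

    pop = [random_candidate() for _ in range(pop_size)]
    for _ in range(gens):                 # generation budget = termination
        pop.sort(key=lambda c: loglik(X, c), reverse=True)
        parents = pop[: pop_size // 2]    # selection of the fittest
        children = []
        for _ in range(pop_size - len(parents)):
            pa, pb = rng.choice(len(parents), size=2, replace=False)
            # reproduction: child takes each Gaussian component, as a
            # whole, from one parent or the other
            pick = rng.integers(0, 2, size=k).astype(bool)
            w = np.where(pick, parents[pa][0], parents[pb][0])
            mu = np.where(pick, parents[pa][1], parents[pb][1])
            var = np.where(pick, parents[pa][2], parents[pb][2])
            children.append((w / w.sum(), mu, var))
        pop = parents + children
    return max(pop, key=lambda c: loglik(X, c))
```

Because each component is inherited as an indivisible unit, crossover can only recombine existing components; this is exactly the limitation the first quoted statement raises about exploring the solution space in higher dimensions.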
“…More generic techniques like deterministic annealing [31], [35] and genetic algorithms [23], [17] have been applied to obtain a good set of parameters. Although these techniques have asymptotic guarantees, they are very time consuming and, hence, cannot be used in most practical applications.…”
Section: Relevant Background
confidence: 99%