2003
DOI: 10.1016/s0031-3203(03)00059-1
EM algorithms for Gaussian mixtures with split-and-merge operation

Cited by 139 publications (123 citation statements)
References 14 publications
“…However, global optimizers have high computational demands and this approach is limited to moderately sized datasets. Other algorithms proposed to deal with the problem of local maxima include versions of EM with split and merge operations [34,36], a greedy learning method [35], and a component-wise method [10]. Some of these approaches try to estimate simultaneously the number of components K while searching for the optimal set of mixture parameters H.…”
Section: Related Research
confidence: 99%
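The excerpt above groups split-and-merge EM, greedy learning, and component-wise EM as ways to escape local maxima while also estimating the number of components K. As a minimal point of reference (not any of the cited algorithms), the sketch below selects K by running standard EM for several candidate values and keeping the model with the lowest BIC; the synthetic dataset, candidate range, and use of scikit-learn are illustrative assumptions.

```python
# Minimal sketch (not the cited split-and-merge / greedy / component-wise
# methods): estimate K by running plain EM for several candidate values
# and keeping the fit with the lowest BIC. Data and settings are illustrative.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=(0.0, 0.0), scale=0.5, size=(200, 2)),
    rng.normal(loc=(3.0, 3.0), scale=0.7, size=(200, 2)),
    rng.normal(loc=(0.0, 4.0), scale=0.6, size=(200, 2)),
])

best_gmm, best_bic = None, np.inf
for k in range(1, 8):
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
    bic = gmm.bic(X)
    if bic < best_bic:
        best_gmm, best_bic = gmm, bic

print("selected K =", best_gmm.n_components, "BIC =", round(best_bic, 1))
```

Unlike the cited approaches, this brute-force search refits every candidate model from scratch; split-and-merge and greedy schemes instead try to reuse the current fit when changing the component structure.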
“…As the C passes its peak value, which is detected as its value decreases in the next iteration, we merge the two Gaussians as in [25].…”
Section: Stochastic Exploration Initialization Approach
confidence: 99%
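The merge step referred to above can be sketched with a standard moment-preserving combination of two weighted Gaussians; the peak-detected criterion C used in [25] to decide when to merge is not reproduced here, and the helper below is only an illustrative assumption.

```python
# Hedged sketch of a moment-preserving merge of two Gaussian components.
# This is one common way to realize a "merge" move; the criterion C
# mentioned in the excerpt is not implemented here.
import numpy as np

def merge_gaussians(w1, m1, S1, w2, m2, S2):
    """Merge two weighted Gaussians so the result matches their combined
    weight, mean, and second moment."""
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    d1, d2 = m1 - m, m2 - m
    S = (w1 * (S1 + np.outer(d1, d1)) + w2 * (S2 + np.outer(d2, d2))) / w
    return w, m, S

# Example: two nearby 2-D components collapsed into one.
w, m, S = merge_gaussians(
    0.3, np.array([0.0, 0.0]), 0.5 * np.eye(2),
    0.2, np.array([0.4, 0.1]), 0.6 * np.eye(2),
)
print(w, m, S, sep="\n")
```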
“…Variational Bayesian methods avoid overfitting but only provide an approximate solution, while stochastic methods are computationally very expensive. Moreover, there are two types of EM-based methods in which the number of components need not be fixed in advance: divisive methods, where the estimate starts from a single component that is split into multiple components as the algorithm proceeds [3,4], [25], and agglomerative methods, where the estimate starts from a large number of components that is reduced as the algorithm proceeds [7,8]. A variety of ways have been proposed for splitting or merging GMM components [3], [14], [22], [24,25].…”
Section: Introduction
confidence: 99%
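For the divisive (split) direction discussed above, one simple and commonly used move is to place two child components along the principal axis of the parent covariance and let a subsequent EM run refine them; the step size and the choice to copy the parent covariance are illustrative assumptions, not the split criteria of the cited papers.

```python
# Hedged sketch of one simple "split" move: displace two children along the
# principal axis of the parent component and halve its weight. This is only
# an initialization heuristic that EM would then refine.
import numpy as np

def split_gaussian(w, m, S, delta=0.5):
    """Split (weight, mean, covariance) into two children displaced along
    the leading eigenvector of the covariance."""
    evals, evecs = np.linalg.eigh(S)                  # ascending eigenvalues
    step = delta * np.sqrt(evals[-1]) * evecs[:, -1]  # along principal axis
    return (w / 2, m + step, S.copy()), (w / 2, m - step, S.copy())

(wa, ma, Sa), (wb, mb, Sb) = split_gaussian(
    0.5, np.array([1.0, 1.0]), np.array([[2.0, 0.8], [0.8, 1.0]])
)
print(ma, mb)
```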
“…We choose to use the L*u*v* color space for segmentation as in [6,7]. In our case the image color space is represented by a mixture of GMM components.…”
Section: Color Image Segmentation
confidence: 99%
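A minimal sketch of the pipeline described above (pixels represented in L*u*v* and modelled by a mixture of Gaussians) is given below; the test image, the component count of 5, and the use of scikit-image/scikit-learn are assumptions for illustration, not details taken from [6,7].

```python
# Hedged sketch: convert an RGB image to L*u*v*, fit a GMM to the pixel
# colors, and read off a segmentation from the component labels.
# Image, K=5, and libraries are illustrative choices.
import numpy as np
from skimage import data
from skimage.color import rgb2luv
from sklearn.mixture import GaussianMixture

img = data.astronaut()                     # any RGB image
pixels = rgb2luv(img).reshape(-1, 3)       # (H*W) x 3 in L*u*v*

gmm = GaussianMixture(n_components=5, covariance_type="full",
                      random_state=0).fit(pixels)
labels = gmm.predict(pixels).reshape(img.shape[:2])
print("segment sizes:", np.bincount(labels.ravel()))
```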
“…EM has been applied to color image segmentation [6] together with other maximum log-likelihood approaches such as mean shift analysis [7]. In this paper we present a variational approach for color image segmentation.…”
Section: Introduction
confidence: 99%
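For comparison with the EM-based segmentation mentioned above, a mean-shift pass over the same pixel colors can be sketched as follows; the subsampling factor and bandwidth quantile are illustrative assumptions, and the snippet does not reproduce the method of [7].

```python
# Hedged sketch: mean-shift mode seeking on L*u*v* pixel colors, as a
# non-parametric alternative to EM-based segmentation. Subsampling and
# bandwidth settings are illustrative, not taken from the cited work.
from skimage import data
from skimage.color import rgb2luv
from sklearn.cluster import MeanShift, estimate_bandwidth

pixels = rgb2luv(data.astronaut()).reshape(-1, 3)
sample = pixels[::50]                      # subsample to keep mean shift fast

bw = estimate_bandwidth(sample, quantile=0.1, random_state=0)
modes = MeanShift(bandwidth=bw, bin_seeding=True).fit(sample)
print("modes found:", len(modes.cluster_centers_))
```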