2016
DOI: 10.1007/978-3-319-46227-1_5

A Split-Merge DP-means Algorithm to Avoid Local Minima

Abstract: We present an extension of the DP-means algorithm, a hard-clustering approximation of nonparametric Bayesian models. Although recent work [6] reports that DP-means can converge to a local minimum, the conditions under which this happens are still unknown. This paper demonstrates one reason DP-means converges to a local minimum: it cannot assign the optimal number of clusters when many data points lie within small distances of one another. As a first attempt to avoid the local minimum…
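
To make the clustering rule the abstract refers to concrete, here is a minimal sketch of the standard sequential DP-means loop, in the spirit of Kulis and Jordan's formulation; the names `dp_means`, `X`, and `lam` (the cluster-penalty parameter) are illustrative choices, not taken from the paper. It is meant only to show where the cluster count is decided, not to reproduce the split-merge extension proposed here.

```python
# Minimal sketch of sequential DP-means, the hard-clustering approximation
# named in the abstract. All identifiers (dp_means, X, lam, n_iter) are
# illustrative; a new cluster is opened whenever a point's squared distance
# to every existing centre exceeds the penalty lam.
import numpy as np

def dp_means(X, lam, n_iter=50):
    centers = [X[0].copy()]                      # start from a single cluster
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assignment step: visit points sequentially, opening clusters on demand.
        for i, x in enumerate(X):
            d2 = np.array([np.sum((x - c) ** 2) for c in centers])
            if d2.min() > lam:
                centers.append(x.copy())         # far from all centres: new cluster
                labels[i] = len(centers) - 1
            else:
                labels[i] = int(d2.argmin())
        # Update step: each centre becomes the mean of its members; keep the
        # old centre if a cluster happens to lose all of its points.
        centers = [X[labels == k].mean(axis=0) if np.any(labels == k) else c
                   for k, c in enumerate(centers)]
    return np.array(centers), labels
```

Because a new cluster is opened only when a point is far from every existing centre, points that sit close to one another (and to an existing centre) are absorbed into that cluster, which gives an intuition for the situation the abstract identifies as producing a suboptimal number of clusters.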

Cited by 4 publications (9 citation statements) · References 17 publications
“…Proof. We show that the objective function (4) monotonically decreases when the k-th cluster center θ_k is newly updated to θ̂_k by (9). More specifically…”
Section: Concave Case
confidence: 87%
“…which is the weighted mean of x_i weighted by f with the Bregman divergence as its argument. However, the objective function monotonically decreases under this update rule (9) only when the function f is concave (or linear). If f is convex, the cluster center is instead updated by a gradient-based optimization such as steepest descent or Newton's method.…”
Section: Derivation Of Update Rules
confidence: 99%
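
The excerpt above describes a fixed-point update in which the new centre is a weighted mean of the assigned points, with the weights given by a function f of the Bregman divergence to the current centre. The sketch below shows that form of update under stated assumptions: squared Euclidean distance stands in for the Bregman divergence and a simple concave f is used as a placeholder, since equations (4) and (9) of the citing paper are not reproduced here.

```python
# Sketch of a weighted-mean centre update of the form described in the excerpt:
#   theta_hat_k = sum_i f(d(x_i, theta_k)) * x_i / sum_i f(d(x_i, theta_k))
# The divergence d (squared Euclidean) and the concave weight f below are
# placeholder assumptions, not the definitions used by the citing paper.
import numpy as np

def weighted_mean_update(X_k, theta_k, f=lambda t: np.sqrt(t + 1e-12)):
    d = np.sum((X_k - theta_k) ** 2, axis=1)   # divergence of each point to the centre
    w = f(d)                                   # per-point weights f(divergence)
    return (w[:, None] * X_k).sum(axis=0) / w.sum()
```

Per the excerpt, iterating an update of this kind monotonically decreases the objective only when f is concave (or linear); for a convex f, the centre would instead be moved by a gradient-based step such as steepest descent or Newton's method.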