2017
DOI: 10.1080/00949655.2017.1327588
Semi-supervised k-means++

Abstract: Traditionally, practitioners initialize the k-means algorithm with centers chosen uniformly at random. Randomized initialization with uneven weights (k-means++) has recently been used to improve the performance over this strategy in cost and run-time. We consider the k-means problem with semi-supervised information, where some of the data are pre-labeled, and we seek to label the rest according to the minimum cost solution. By extending the k-means++ algorithm and analysis to account for the labels, we derive …
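One plausible reading of the seeding idea described in the abstract can be sketched as follows. This is not the paper's exact algorithm: here each pre-labeled class contributes the mean of its labeled points as a fixed initial center, and the remaining centers are drawn with the standard k-means++ D² weighting. The function name and signature are illustrative only.

```python
import numpy as np

def ss_kmeans_pp_seeds(X, labeled_idx, labels, k, rng=None):
    """Sketch of semi-supervised k-means++ seeding.

    Pre-labeled classes contribute their labeled means as fixed
    centers; any remaining centers are drawn with the usual
    D^2 weighting of k-means++ over all points.
    """
    rng = np.random.default_rng(rng)
    # one fixed center per labeled class: the mean of its labeled points
    centers = [X[labeled_idx[labels == c]].mean(axis=0)
               for c in np.unique(labels)]
    while len(centers) < k:
        # squared distance from each point to its nearest center so far
        d2 = np.min(((X[:, None, :] - np.asarray(centers)[None, :, :]) ** 2)
                    .sum(axis=2), axis=1)
        # sample the next center proportionally to D^2 (k-means++ rule)
        probs = d2 / d2.sum()
        centers.append(X[rng.choice(len(X), p=probs)])
    return np.asarray(centers)
```

Seeding from labeled means rather than random draws is what lets the labels constrain the search; the D² step for the unconstrained centers is unchanged from plain k-means++.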


Cited by 42 publications (23 citation statements)
References 9 publications
“…We aim to demonstrate that the NUMA optimizations we deploy for knor are applicable to a variety of compute-intensive applications. The initial phase will target other variants of k-means like spherical k-means [17], semisupervised k-means++ [40] etc. Later phases will target machine learning algorithms like GMM [15], agglomerative clustering [34] and k-nearest neighbors [10].…”
Section: Future Work and Discussion
confidence: 99%
“…The erroneous GT maps with various patch sizes were used as input for GWENN-SS and ss-Kmeans++ [10]. ss-Kmeans++ is a recent semi-supervised extension of Kmeans++ which can also deal with semi-supervised information, i.e. LS sets.…”
Section: Experiments 2: Salinas HSI
confidence: 99%
“…A DNN-enabled authenticator can be trained in a semi-supervised manner with both the labeled and unlabeled data. This can be done by assigning pseudo labels to the unlabeled samples, which are then exploited as if they are actually labeled [22]- [24]. The pre-trained network can be further fine-tuned by only using the labeled channel observations [25].…”
Section: B. Our Work
confidence: 99%
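The last statement above describes pseudo-labeling: fit a model on the labeled samples, predict labels for the unlabeled ones, then treat those predictions as if they were real labels. A minimal sketch of that round-trip, using a nearest-centroid classifier as a simple stand-in for the DNN the citing paper uses (the function name and the merge-and-return shape are illustrative assumptions):

```python
import numpy as np

def pseudo_label_round(X_lab, y_lab, X_unlab):
    """One round of pseudo-labeling via a nearest-centroid classifier:
    fit class centroids on the labeled data, assign each unlabeled
    sample the label of its nearest centroid, then merge the two sets
    so a downstream model can treat pseudo-labels as real ones."""
    classes = np.unique(y_lab)
    # class centroids estimated from the labeled data only
    centroids = np.stack([X_lab[y_lab == c].mean(axis=0) for c in classes])
    # squared distance from each unlabeled sample to each centroid
    d2 = ((X_unlab[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    y_pseudo = classes[d2.argmin(axis=1)]
    # merged set: pseudo-labels are used exactly like true labels downstream
    return np.vstack([X_lab, X_unlab]), np.concatenate([y_lab, y_pseudo])
```

In practice one would iterate this (or, as the citing paper notes, fine-tune afterwards on the labeled data only), and often keep only high-confidence pseudo-labels; the sketch keeps everything for brevity.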