2021
DOI: 10.48550/arxiv.2102.04050
Preprint

A Constant Approximation Algorithm for Sequential Random-Order No-Substitution k-Median Clustering

Abstract: We study k-median clustering under the sequential no-substitution setting. In this setting, a data stream is sequentially observed, and some of the points are selected by the algorithm as cluster centers. However, a point can be selected as a center only immediately after it is observed, before observing the next point. In addition, a selected center cannot be substituted later. We give a new algorithm for this setting that obtains a constant approximation factor on the optimal risk under a random arrival orde…
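To make the setting concrete, here is a minimal, hypothetical sketch of the online selection interface the abstract describes: each point is seen once, and the decision to keep it as a center is made immediately and cannot be revoked. The function name, the `select_rule` callback, the distance threshold, and the toy data are illustrative assumptions, not the paper's actual algorithm.

```python
import random

def sequential_no_substitution_select(stream, select_rule):
    """Skeleton of the sequential no-substitution setting: each point is
    observed once, and the decision to keep it as a center must be made
    immediately and is irrevocable."""
    centers = []
    for i, point in enumerate(stream):
        # The decision may use only the points observed so far and the
        # centers already committed to; it cannot be revised later.
        if select_rule(point, centers, i):
            centers.append(point)
    return centers

# Toy usage: a hypothetical rule that keeps a point if it is far from every
# center chosen so far, on a randomly ordered 1-D stream (random arrival
# order, as assumed by the paper).
points = [random.gauss(mu, 1.0) for mu in (0, 0, 10, 10, 20, 20)]
random.shuffle(points)
far_rule = lambda p, centers, i: all(abs(p - c) > 3.0 for c in centers)
print(sequential_no_substitution_select(points, far_rule))
```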

Cited by 2 publications (9 citation statements)
References 15 publications
“…When all the points are removed, the algorithm stops and calculates the final clustering from the centers selected in the centralized clustering runs. SOCCER combines clustering approaches designed for two different settings: the distributed setting (Ene et al., 2011) and the online setting (Hess et al., 2021). Ene et al. (2011) iteratively samples points from the machines and then removes points that are close to them from consideration.…”
Section: Introduction (mentioning)
confidence: 99%
“…Ene et al. (2011) iteratively samples points from the machines and then removes points that are close to them from consideration. We show that calculating a clustering on the point sample, along with a technique adapted from Hess et al. (2021), leads to a more accurate removal of points. This provides a practical and successful algorithm with approximation guarantees that depend only on the number of points that the coordinator can cluster.…”
Section: Introduction (mentioning)
confidence: 99%
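The quoted passages describe an iterative pattern: sample provisional centers, remove the points they already cover, and repeat until nothing remains. The sketch below illustrates only that general sample-and-remove loop under assumed names and parameters (`sample_size`, `removal_radius`); it is not the SOCCER algorithm or the procedure of Ene et al. (2011).

```python
import math
import random

def sample_cluster_remove(points, sample_size, removal_radius):
    """Illustrative sketch of the iterative sample-then-remove idea quoted
    above: repeatedly sample points as provisional centers, drop every
    remaining point within removal_radius of a sampled point, and stop
    when no points remain. All parameters here are assumptions."""
    remaining = list(points)
    centers = []
    while remaining:
        sample = random.sample(remaining, min(sample_size, len(remaining)))
        centers.extend(sample)
        # Remove every point that is already well covered by the sample.
        remaining = [
            p for p in remaining
            if all(math.dist(p, s) > removal_radius for s in sample)
        ]
    return centers

# Toy usage on 2-D points drawn around three loose cluster centers.
data = [(random.gauss(cx, 0.5), random.gauss(cy, 0.5))
        for cx, cy in [(0, 0), (5, 5), (10, 0)] for _ in range(30)]
print(len(sample_cluster_remove(data, sample_size=5, removal_radius=2.0)))
```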