2014
DOI: 10.1007/978-3-319-08404-6_5

New Approximability Results for the Robust k-Median Problem

Abstract: We consider a robust variant of the classical k-median problem, introduced by Anthony et al. [2]. In the Robust k-Median problem, we are given an n-vertex metric space (V, d) and m client sets S_1, …, S_m ⊆ V. The objective is to open a set F ⊆ V of k facilities such that the worst-case connection cost over all client sets is minimized; in other words, to minimize max_i Σ_{v ∈ S_i} d(F, v). Anthony et al. showed an O(log m)-approximation algorithm for any metric and APX-hardness even in the case of a uniform metric. In this paper, we show…
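The objective above can be made concrete with a small sketch. The function and variable names below (`robust_k_median_cost`, `brute_force_robust_k_median`) are illustrative, not from the paper; the brute-force search is exponential and only meant to pin down the definition on tiny instances, not to reflect the paper's algorithms.

```python
from itertools import combinations

def robust_k_median_cost(d, F, client_sets):
    """Worst-case connection cost max_i sum_{v in S_i} d(F, v),
    where d(F, v) = min over facilities f in F of d[f][v]."""
    return max(
        sum(min(d[f][v] for f in F) for v in S)
        for S in client_sets
    )

def brute_force_robust_k_median(d, V, client_sets, k):
    """Exhaustively try every k-subset of V as the facility set F.
    Exponential in |V|; for illustrating the objective only."""
    return min(
        (set(F) for F in combinations(V, k)),
        key=lambda F: robust_k_median_cost(d, F, client_sets),
    )
```

For example, on the line metric over four points with client sets {0, 1} and {2, 3} and k = 2, any facility pair that serves each set within distance 1 achieves worst-case cost 1, whereas a single facility at point 0 pays 2 + 3 = 5 for the far client set.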

Cited by 8 publications (10 citation statements). References 18 publications.
“…In this setting, Theorem 1.5 gives a constant factor approximation which is essentially optimal. This bound implies an O(log n / log log n)^{1/p}-approximation for (p, ∞)-Fair Clustering, which matches the recent approximation algorithm of [Makarychev and Vakilian, 2021] and the hardness result of [Bhattacharya et al, 2014]. Thus, for any value of p, the approximation guarantee of Theorem 1.5 smoothly interpolates between the optimal approximation bounds for the previously studied special cases of q = ∞ and q = p.…”
Section: Our Results and Techniques (supporting)
confidence: 79%
“…(Bar-Yossef et al, 2002;Feigenbaum et al, 2005;Baswana, 2008;Kelner and Levin, 2011;Ahn et al, 2012b,a;Goel et al, 2012b,a;Ahn et al, 2012a;Baswana et al, 2012;Crouch et al, 2013;Ahn et al, 2013;McGregor, 2014;Baswana et al, 2015;Bhattacharya et al, 2015;Bernstein and Stein, 2016;Abraham et al, 2016a,b;Boutsidis et al, 2016;Kapralov et al, 2017;Song et al, 2017a,b). In addition, k-means and k-median were studied in various different settings, e.g., (Charikar et al, 1998;Indyk and Price, 2011;Backurs et al, 2016;Bhattacharya et al, 2014;Sohler and Woodruff, 2018).…”
Section: Related Work (mentioning)
confidence: 99%
“…While k-MEANS, k-MEDIAN, and k-CENTER admit constant-factor approximations, it is not very surprising that ROBUST (k, z)-CLUSTERING is harder due to its generality: Makarychev and Vakilian [103] design a polynomial-time O(log m / log log m)-approximation algorithm, which is tight under a standard complexity assumption [24]. As this precludes the existence of efficient constant-factor approximation algorithms, recent works have focused on designing constant-factor parameterized approximation algorithms.…”
Section: Robust Clustering In Discrete Geometric Spaces (mentioning)
confidence: 99%