2010
DOI: 10.1007/s00493-010-2302-z
Maximum gradient embeddings and monotone clustering

Abstract: Let (X, d_X) be an n-point metric space. We show that there exists a distribution D over non-contractive embeddings into trees f : X → T such that for every x ∈ X, …, where C is a universal constant. Conversely, we show that the above quadratic dependence on log n cannot be improved in general. Such embeddings, which we call maximum gradient embeddings, yield a framework for the design of approximation algorithms for a wide range of clustering problems with monotone costs, including fault-tolerant versions of k-me…
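The displayed inequality did not survive extraction. Based on the abstract's own wording (a bound on the expected maximum gradient of a non-contractive tree embedding, with a quadratic dependence on log n that is shown to be sharp), the statement presumably takes the following form; the exact normalization and constant are assumptions, not quoted from the paper:

```latex
\[
\mathbb{E}_{f \sim \mathcal{D}}
\left[
  \max_{y \in X \setminus \{x\}}
  \frac{d_T\bigl(f(x), f(y)\bigr)}{d_X(x, y)}
\right]
\le C \,(\log n)^2 .
\]
```

Here the expectation is over the random embedding f drawn from the distribution D, and non-contractivity means d_T(f(x), f(y)) ≥ d_X(x, y) for all x, y ∈ X.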

Cited by 5 publications (1 citation statement)
References 41 publications (101 reference statements)
“…Important contributions of Calinescu, Karloff and Rabani [12] and Fakcharoenphol, Rao and Talwar [22] resulted in a sharp form of "Bartal's random tree method", and our work builds on these ideas. In [36,37] such random ultrametrics were used in order to prove maximal-type inequalities of a very different nature (motivated by embedding problems, as ultrametrics are isometric to subsets of Hilbert space [29]); these results also served as some inspiration for our work.…”
Section: 2 (mentioning)
Confidence: 99%