2022
DOI: 10.1109/tsipn.2022.3164352

Greedy $k$-Center From Noisy Distance Samples


Cited by 3 publications (1 citation statement)
References 30 publications
“…We define that given a number of features in a metric space, we wish to find a subset of k features such that the minimum distance between any two features within the subset is maximized. However, this problem is known to be NP-hard [21], making it unsuitable for our real-world applications. R$^2$MMT instead uses a greedy implementation of the algorithm proven to be Ω(log k)-competitive with the optimal solution while proving to be significantly faster, especially for larger sets of data [1].…”
Section: Online Unsupervised Domain Adaptation
confidence: 99%
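
For context, the greedy selection rule referred to in the statement above is the classic farthest-point heuristic: repeatedly add the feature whose distance to the already-selected set is largest. The sketch below is a minimal illustration assuming exact, noise-free pairwise distances (the function name `greedy_k_center` and the numpy-based setup are assumptions for illustration); the cited paper addresses the harder setting where only noisy distance samples are available.

```python
# Illustrative sketch, not the cited paper's algorithm: farthest-point greedy
# selection for k-center / max-min dispersion, assuming exact distances.
import numpy as np

def greedy_k_center(points: np.ndarray, k: int, seed: int = 0) -> list[int]:
    """Select k row indices from `points` (n x d array) by farthest-point greedy."""
    rng = np.random.default_rng(seed)
    n = points.shape[0]
    selected = [int(rng.integers(n))]  # start from an arbitrary point
    # Distance from every point to the nearest selected point so far.
    min_dist = np.linalg.norm(points - points[selected[0]], axis=1)
    for _ in range(1, k):
        nxt = int(np.argmax(min_dist))  # point farthest from the current selection
        selected.append(nxt)
        new_dist = np.linalg.norm(points - points[nxt], axis=1)
        min_dist = np.minimum(min_dist, new_dist)
    return selected

# Example usage on random 2-D features.
if __name__ == "__main__":
    X = np.random.default_rng(1).normal(size=(200, 2))
    print("selected indices:", greedy_k_center(X, k=5))
```

Each iteration costs O(n) distance updates, so selecting k points takes O(nk) distance evaluations, which is why the greedy rule scales well to larger feature sets.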