2018
DOI: 10.1109/lsp.2018.2847908
Quantized Compressive K-Means

Abstract: The recent framework of compressive statistical learning proposes to design tractable learning algorithms that use only a heavily compressed representation, or sketch, of massive datasets. Compressive K-Means (CKM) is such a method: it aims at estimating the centroids of data clusters from pooled, non-linear, random signatures of the learning examples. While this approach significantly reduces computational time on very large datasets, its digital implementation wastes acquisition resources because the learning …
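The abstract describes the sketching stage only at a high level; the following is a minimal NumPy illustration of what a pooled, non-linear, random signature can look like (random Fourier features averaged over the samples), with a dithered one-bit quantization added as a stand-in for the quantized sketches the paper studies. The function name sketch_dataset, the cluster setup, and the specific quantizer are assumptions made for this example, not the authors' implementation; the centroid-recovery stage of CKM is omitted entirely.

```python
import numpy as np

def sketch_dataset(X, W, dither=None, quantize=False):
    # Pool non-linear random signatures of all samples into one m-dim sketch.
    # X: (N, d) data, W: (m, d) random frequencies, dither: (m,) phase offsets.
    phases = X @ W.T                       # (N, m) projections of each sample
    if dither is not None:
        phases = phases + dither           # add the dither xi to every sample
    features = np.exp(1j * phases)         # random Fourier signature per sample
    if quantize:
        # illustrative one-bit quantization of real and imaginary parts
        features = (np.sign(features.real) + 1j * np.sign(features.imag)) / np.sqrt(2)
    return features.mean(axis=0)           # pooled sketch of the whole dataset

# Toy usage: three Gaussian clusters in 2-D, sketch of size m = 64.
rng = np.random.default_rng(0)
true_centroids = 5.0 * rng.normal(size=(3, 2))
X = np.vstack([c + 0.3 * rng.normal(size=(200, 2)) for c in true_centroids])
W = rng.normal(size=(64, 2))
xi = rng.uniform(0.0, 2.0 * np.pi, size=64)

z_rf = sketch_dataset(X, W)                           # unquantized sketch
z_q = sketch_dataset(X, W, dither=xi, quantize=True)  # dithered, quantized sketch
print(np.abs(z_rf - z_q).mean())                      # the two sketches stay close
```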

Cited by 16 publications (12 citation statements)
References: 28 publications
“…Although we focused on k-means clustering, other learning tasks can be solved in a compressive manner and should be investigated, such as Gaussian mixtures fitting or principal components analysis. We leave for future work the idea of using quantized sketches [26] (quantization for privacy has already been considered [27,28]), and leveraging fast transforms to speed-up the process [29]. Using additive noise on the data samples themselves is also a possibility that should be investigated.…”
Section: Results
mentioning confidence: 99%
“…[25] Shape recognition Fuzzy k-means clustering ensemble (FKMCE). [26] Signal processing Compressive k-means clustering (CKM).…”
Section: Reference
mentioning confidence: 99%
“…The effect of dithering is to make the quantized Φ_q behave similarly to non-quantized Φ_RF on average. For instance, it was shown in [43] that for each W, x, x′, and ξ,…”
Section: Sketching With Quantized Contributions
mentioning confidence: 99%
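The excerpt breaks off before the equation from [43], so only the general principle it appeals to can be illustrated here: a dither drawn uniformly over one quantization step makes a uniform quantizer unbiased on average. The quantizer, step size, and test value below are assumptions chosen for a quick Monte Carlo check, not the specific Φ_q of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
delta = 0.5          # quantization step (assumed)
x = 0.37             # arbitrary test value

def quantize(v, step):
    # uniform mid-tread quantizer: rounds to the nearest multiple of `step`
    return step * np.round(v / step)

# dither drawn uniformly over one quantization step
xi = rng.uniform(-delta / 2, delta / 2, size=200_000)
avg = quantize(x + xi, delta).mean()

print(f"x = {x:.4f}, average dithered quantization = {avg:.4f}")
# the average converges to x, mirroring "behaves similarly ... on average"
```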