2019 IEEE Global Communications Conference (GLOBECOM)
DOI: 10.1109/globecom38437.2019.9013625

Robust Coreset Construction for Distributed Machine Learning

Abstract: Motivated by the need to solve machine learning problems over distributed datasets, we explore the use of coresets to reduce the communication overhead. A coreset is a summary of the original dataset in the form of a small weighted set in the same sample space. Compared to other data summaries, a coreset has the advantage that it can be used as a proxy of the original dataset, potentially for different applications. However, existing coreset construction algorithms are each tailor-made for a specific machine learning…
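As a rough illustration of the coreset idea described in the abstract (a small weighted set whose weighted cost approximates that of the full dataset), here is a minimal Python sketch based on plain uniform sampling; this is not the paper's construction, and the function names, the weighting scheme, and the k-means cost used as the example query are assumptions for illustration only.

```python
# Minimal sketch (assumed, not the paper's algorithm): a weighted coreset built
# by uniform sampling. Each of the m sampled points is weighted n/m so that the
# weighted cost on the coreset is an unbiased estimate of the cost on the full data.
import numpy as np

def uniform_coreset(X, m, seed=None):
    """Return (coreset_points, weights) for an (n, d) data matrix X."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    weights = np.full(m, n / m)  # each sampled point stands in for n/m points
    return X[idx], weights

def kmeans_cost(X, centers, weights=None):
    """(Weighted) sum of squared distances to the nearest center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1).min(axis=1)
    return d2.sum() if weights is None else (weights * d2).sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(10_000, 5))          # full dataset
    C, w = uniform_coreset(X, m=500, seed=1)  # small weighted summary
    centers = rng.normal(size=(3, 5))         # arbitrary query: 3 candidate centers
    # The weighted coreset cost should be close to the full-data cost.
    print(kmeans_cost(X, centers), kmeans_cost(C, centers, w))
```

In a distributed setting, each node could send only its coreset points and weights instead of its raw data, which is the communication saving the abstract refers to; more refined constructions (e.g., sensitivity-based sampling) give stronger guarantees than uniform sampling.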


Cited by 8 publications (2 citation statements) | References 36 publications
“…To solve the problem defined in (1)-(2), in this section we propose L-FGADMM, by extending GADMM proposed in our prior work [8]. Following GADMM (see Fig.…”
Section: Proposed Algorithm: L-FGADMM
confidence: 99%
“…Interest in data-driven machine learning (ML) is on the rise, but difficulties in securing data still remain [1], [2]. Mission-critical applications, which require a large volume of up-to-date data to cope with local environments in a timely manner even under extreme events, aggravate this challenge.…”
Section: Introduction
confidence: 99%