IEEE INFOCOM 2021 - IEEE Conference on Computer Communications 2021
DOI: 10.1109/infocom42981.2021.9488723
Sample-level Data Selection for Federated Learning

Cited by 86 publications (19 citation statements)
References 14 publications
“…Note that FedBalancer uses the loss of a sample to measure the statistical utility (and thus the importance) of a sample to the current model, similar to Importance Sampling [45,55]. While other studies have also leveraged the gradient norm or a gradient norm upper bound [3,30,38] to achieve the same goal, we use loss as it is more widely applicable to FL tasks with non-gradient-based optimizations [53].…”
Section: Client Sample Selection
confidence: 99%
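The loss-based utility measure described in the quote above can be sketched in a few lines: score each sample by its current training loss and keep the highest-loss samples for the round. This is a minimal illustration, not FedBalancer's actual implementation; the function name and the fixed-k selection rule are assumptions for the sketch.

```python
import numpy as np

def select_high_utility_samples(losses, k):
    """Return indices of the k samples with the highest loss.

    Loss is used as a proxy for each sample's statistical utility to the
    current model, as in loss-based importance sampling.
    """
    losses = np.asarray(losses, dtype=float)
    # Sort descending by loss and keep the top-k indices.
    return np.argsort(losses)[::-1][:k]

# Example: four samples with per-sample losses; pick the two most useful.
per_sample_losses = [0.1, 2.3, 0.7, 1.5]
selected = select_high_utility_samples(per_sample_losses, k=2)
print(selected)  # indices of the two highest-loss samples
```

In practice a system like the one quoted would recompute these losses as the model evolves and adapt how many samples to keep per round, rather than fixing k.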
“…Moreover, it requires an example dataset, which is hardly applicable in FL scenarios where the client data distributions are usually unknown. Li et al. [38] propose prioritizing samples with higher importance in FL using a gradient norm upper bound [30]. However, their approach does not specify how many samples should be selected in each round.…”
Section: Related Work
confidence: 99%
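For the gradient-norm-upper-bound importance mentioned in the quote above, a common proxy for cross-entropy models is the norm of the last-layer pre-activation gradient, ||softmax(z) − y||, which upper-bounds the full per-sample gradient norm up to a constant. The sketch below assumes a softmax/cross-entropy setup; the function names are illustrative, not from the cited papers.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(z - np.max(z))
    return e / e.sum()

def grad_norm_upper_bound(logits, label):
    """Importance proxy for one sample: ||softmax(z) - y||.

    For cross-entropy loss, this last-layer gradient norm upper-bounds
    the per-sample gradient norm (up to a model-dependent constant), so
    it can rank samples without a full backward pass per sample.
    """
    p = softmax(np.asarray(logits, dtype=float))
    y = np.zeros_like(p)
    y[label] = 1.0
    return np.linalg.norm(p - y)

# A confidently wrong sample scores much higher than a confidently
# correct one, so it would be prioritized for training.
correct = grad_norm_upper_bound([10.0, -10.0], label=0)
wrong = grad_norm_upper_bound([-10.0, 10.0], label=0)
print(correct < wrong)
```

The appeal of the bound is computational: ranking by ||softmax(z) − y|| needs only a forward pass, whereas exact per-sample gradient norms would require per-sample backpropagation.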
“…We impose no constraints on client selection [40,43,45,52,76,81,88] or training data sampling [44,76] strategies, making it compatible with much of the recent FL systems literature.…”
Section: Federated Learning
confidence: 99%
“…Integration with the existing FL framework: AutoFedNLP's trial groups are compatible with how existing FL frameworks manage clients for training efficiency, a key system component that has received considerable research attention [40,44,45,52,76,81,88]. This is because the adapters and their configuration scheduler are intentionally designed to be decoupled from which devices or data are involved in per-round training.…”
Section: Configurator Algorithm in Detail
confidence: 99%