2020
DOI: 10.1007/978-3-030-63076-8_11

A Principled Approach to Data Valuation for Federated Learning

Abstract: Federated Learning (FL), a distributed learning paradigm that scales on-device learning collaboratively, has emerged as a promising approach for decentralized AI applications. Local optimization methods such as Federated Averaging (FedAvg) are the most prominent methods for FL applications. Despite their simplicity and popularity, the theoretical understanding of local optimization methods is far from clear. This dissertation aims to advance the theoretical foundation of local methods in the following three di…

Cited by 95 publications (73 citation statements)
References 102 publications
“…To leverage SV for measuring FL participant contributions, current research focuses on improving the efficiency of SV approximation while maintaining accuracy. Existing works in this domain can be divided into two main categories: 1) reducing the number of sub-model evaluations required [7,9,25], and 2) accelerating a single round of evaluation [20,25]. To accelerate a single round of evaluation, FL can utilize participants' gradient updates to reconstruct various FL sub-models instead of re-training each variant of the model from scratch.…”
Section: A-B-C B-C-A C-A-B A-C-B C-B-A B-A-C SV
confidence: 99%
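The reconstruction idea in the excerpt above can be sketched in a few lines. This is a minimal illustration, not the cited papers' method: the function name `reconstruct_submodel`, the toy two-parameter model, and the single averaged-gradient step are all assumptions made for the example.

```python
def reconstruct_submodel(global_weights, gradient_updates, subset, lr=1.0):
    """Approximate the sub-model for a coalition `subset` of participants by
    applying only those participants' averaged gradient updates to the shared
    global weights, instead of re-training the coalition's model from scratch."""
    if not subset:
        return list(global_weights)
    members = [gradient_updates[i] for i in subset]
    avg = [sum(vals) / len(members) for vals in zip(*members)]
    return [w - lr * g for w, g in zip(global_weights, avg)]

# Toy example: 3 participants, a 2-parameter model starting from zero weights.
w = [0.0, 0.0]
updates = {0: [0.1, 0.0], 1: [0.0, 0.2], 2: [0.1, 0.2]}
sub_01 = reconstruct_submodel(w, updates, {0, 1})  # coalition of participants 0 and 1
```

Because each coalition's model is derived from already-collected gradients, evaluating many coalitions costs only a weight update and an evaluation pass, not a full training run per coalition.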
“…However, they still require evaluating every possible FL sub-model as in the original SV setting. In [25], the authors proposed the Federated SV approach with two acceleration techniques for within-round SV estimation: 1) random permutation sampling and 2) group testing. However, the number of random samples is fixed and not tailored to each permutation.…”
Section: FL Participant Contribution Evaluation Via Gradient Shapley ...
confidence: 99%
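Random permutation sampling, the first of the two acceleration techniques named above, can be sketched as a Monte Carlo Shapley estimator: average each participant's marginal contribution over sampled orderings. This is a generic sketch, not the Federated SV implementation from [25]; the additive toy utility and the fixed `n_samples` budget (the very limitation the excerpt criticizes) are assumptions for illustration.

```python
import random

def shapley_permutation_sampling(players, utility, n_samples=200, seed=0):
    """Monte Carlo Shapley estimate: for each sampled permutation, walk the
    ordering and credit each player with its marginal utility contribution."""
    rng = random.Random(seed)
    phi = {p: 0.0 for p in players}
    for _ in range(n_samples):
        perm = list(players)
        rng.shuffle(perm)
        coalition = set()
        prev_u = utility(frozenset())
        for p in perm:
            coalition.add(p)
            u = utility(frozenset(coalition))
            phi[p] += u - prev_u
            prev_u = u
    return {p: v / n_samples for p, v in phi.items()}

# Toy additive utility: a coalition is worth the sum of its members' "data quality".
quality = {"A": 3.0, "B": 1.0, "C": 2.0}
est = shapley_permutation_sampling(["A", "B", "C"],
                                   lambda S: sum(quality[p] for p in S))
```

For an additive game like this toy one, every marginal contribution equals the player's own quality, so the estimate is exact; for real FL utilities the variance of the estimate depends on the sampling budget, which motivates adapting the number of samples per permutation rather than fixing it.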
“…Within an FL framework, the unique nature of having the data separated entails concern about an equitable distribution of profit. Shapley values are a well-established method for determining equitable payoff, yet in FL they are computationally prohibitive and of limited applicability in the horizontal setting, due in part to their inability to be extrapolated (Huang et al, 2020; Wang et al, 2020). Yet, the nature of FL does warrant an evaluation of the utility it poses, as well as of any derivative models that can be created, such as FedCoin (Liu et al, 2020).…”
Section: Contribution Equity
confidence: 99%
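To see why the excerpt calls exact Shapley values computationally prohibitive, it helps to write out the definition: the exact value enumerates every coalition, which is exponential in the number of participants. The sketch below is a textbook computation under an assumed toy utility, not any cited paper's method; it also checks the efficiency axiom (payoffs sum to the grand-coalition utility), which is what makes the split "equitable".

```python
from itertools import combinations
from math import factorial

def exact_shapley(players, utility):
    """Exact Shapley values by enumerating all coalitions; feasible only for a
    handful of participants, which is why FL work resorts to approximations."""
    n = len(players)
    phi = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for k in range(n):
            for S in combinations(others, k):
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += weight * (utility(frozenset(S) | {p}) - utility(frozenset(S)))
        phi[p] = total
    return phi

# Toy superadditive utility: a coalition of size s is worth s**2.
u = lambda S: len(S) ** 2
phi = exact_shapley(["A", "B", "C"], u)
```

With three symmetric players each receives a third of the grand-coalition value; with N participants the enumeration touches 2**(N-1) coalitions per player, which is exactly the cost that sampling-based approximations avoid.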