2021
DOI: 10.48550/arxiv.2102.07623
Preprint

FedBN: Federated Learning on Non-IID Features via Local Batch Normalization

Abstract: The emerging paradigm of federated learning (FL) strives to enable collaborative training of deep models on the network edge without centrally aggregating raw data, thereby improving data privacy. In most federated learning setups, the assumption of independent and identically distributed (IID) samples across local clients does not hold. Under this setting, neural network training performance may vary significantly with the data distribution, and the distribution shift can even hurt training convergence. Most of the previous…

Cited by 60 publications (93 citation statements) · References 25 publications

“…Communication efficiency, system heterogeneity, statistical heterogeneity, and privacy are all major issues in FL [16]. To reduce communication costs in FL, some studies propose using data compression techniques [3,17,18], adding regularization terms to the local optimization objective [6,19], and developing FL counterparts of Batch Normalization [7,20,21]. Moreover, the use of local momentum and global momentum [22] has been shown to facilitate faster convergence.…”
Section: Related Work
confidence: 99%
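
The regularization idea referenced in the statement above can be illustrated with a FedProx-style proximal term added to the local objective. The sketch below is illustrative only and assumes PyTorch; the function name, the mu value, and the parameter handling are my assumptions, not taken from the cited papers.

import torch
import torch.nn.functional as F

def proximal_local_loss(outputs, targets, local_params, global_params, mu=0.01):
    # Standard task loss on the local client's batch.
    task_loss = F.cross_entropy(outputs, targets)
    # Proximal penalty (mu/2) * ||w - w_global||^2 pulling the local weights
    # toward the last global model, in the spirit of FedProx-style methods.
    prox = sum((lp - gp.detach()).pow(2).sum()
               for lp, gp in zip(local_params, global_params))
    return task_loss + 0.5 * mu * prox

Larger mu ties local updates more tightly to the global model, which trades off local fit against client drift under statistical heterogeneity.
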
“…However, in the federated setting, data is stored locally and cannot be shared. Recently, federated batch normalization [3] and federated adversarial domain adaptation [2], [9] have been proposed to deal with DA under the privacy-preserving requirement. The work by Li et al. [3] addresses feature shift, i.e., the deviation in feature space, using batch normalization before averaging the local models.…”
Section: B. Domain Adaptation
confidence: 99%
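
A minimal sketch of the FedBN-style aggregation described above: average all parameters across clients except BatchNorm entries, which stay local. It assumes PyTorch state_dicts with identical keys and uses the substring "bn" as a simplifying heuristic to identify BatchNorm parameters and buffers; both are assumptions, and the actual implementation in [3] may differ.

import torch

def fedbn_average(client_states):
    """Average a list of client state_dicts, skipping BatchNorm entries."""
    avg_state = {}
    for key in client_states[0]:
        if "bn" in key:  # heuristic: BatchNorm params/buffers stay client-local
            continue
        avg_state[key] = torch.stack(
            [state[key].float() for state in client_states]).mean(dim=0)
    return avg_state

Each client would then load the averaged weights with load_state_dict(avg_state, strict=False), so its own BatchNorm statistics and affine parameters are preserved, which is how the local normalization absorbs the client-specific feature shift.
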
“…To cope with such diversity, recent works [2], [3] have proposed integrating Unsupervised Domain Adaptation (UDA) into the FL framework. UDA methods force the model to learn domain-agnostic features through adversarial learning [2] or a specific type of batch normalization [3]. In this work, we follow an adversarial UDA approach to handle non-IID data.…”
confidence: 99%
“…Solutions to overcome the non-IID challenge thus need to be developed, e.g., creating an additional subset of datasets to allocate fairly among clients [37], aiming to ensure efficient data training in FL-based smart healthcare. Another promising approach is to handle the feature shift among heterogeneous clients [127] by using local batch normalization to adjust the feature distributions at the client side before averaging the local models. Quantitative metrics are needed to assess non-IID data in the FL-based smart healthcare sector, such as standard deviation, precision, and accuracy with respect to label/feature distribution skew and homogeneous partitions [128].…”
Section: Non-IIDness and Data Quality in FL-based Smart Healthcare
confidence: 99%
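
One simple way to quantify the label-distribution skew mentioned above is the spread of per-client class proportions. The sketch below is an illustrative assumption of mine, not the metric defined in [128]: it returns 0 when every client has an identical label mix and grows with skew.

import numpy as np

def label_distribution_skew(client_labels, num_classes):
    """client_labels: list of 1-D integer label arrays, one per client."""
    proportions = np.stack([
        np.bincount(labels, minlength=num_classes) / len(labels)
        for labels in client_labels
    ])
    # Per-class standard deviation across clients, averaged over classes.
    return proportions.std(axis=0).mean()
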