2022
DOI: 10.1016/j.neucom.2021.08.141
FedSim: Similarity guided model aggregation for Federated Learning

Cited by 51 publications (9 citation statements)
References 26 publications
“…To handle multiple local updates while remaining consistent with the target distribution, the unbiased gradient variant aggregation method proposed by the authors of [60] was used, achieving faster convergence and higher accuracy. The authors of [72] proposed FedSim, an aggregation model combining local and global aggregation: clients with similar gradients are first clustered and aggregated locally, and the cluster results are then aggregated globally for better convergence and lower variance.…”
Section: Aggregation Methods (mentioning)
confidence: 99%
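As a rough illustration of the two-stage idea described in this excerpt, the following is a minimal sketch, not the paper's exact algorithm: client updates are greedily grouped by cosine similarity, averaged within each cluster, and the cluster means are then combined globally, weighted by cluster size. The function names, the threshold-based greedy clustering, and the flat NumPy update vectors are all illustrative assumptions.

```python
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two flattened update vectors.
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def cluster_updates(updates, threshold=0.9):
    # Greedy clustering: place an update into the first cluster whose
    # running mean is cosine-similar above the threshold, else open a new one.
    clusters = []  # each cluster is a list of update vectors
    for u in updates:
        placed = False
        for c in clusters:
            if cosine_similarity(u, np.mean(c, axis=0)) >= threshold:
                c.append(u)
                placed = True
                break
        if not placed:
            clusters.append([u])
    return clusters

def similarity_guided_aggregate(updates, threshold=0.9):
    # Stage 1 (local/cluster aggregation): average within each cluster.
    clusters = cluster_updates(updates, threshold)
    cluster_means = [np.mean(c, axis=0) for c in clusters]
    # Stage 2 (global aggregation): weight each cluster mean by cluster size.
    sizes = np.array([len(c) for c in clusters], dtype=float)
    return np.average(cluster_means, axis=0, weights=sizes / sizes.sum())

# Example: three similar client updates and one dissimilar one.
updates = [np.array([1.0, 0.9]), np.array([0.95, 1.0]),
           np.array([1.05, 0.95]), np.array([-1.0, 0.1])]
global_update = similarity_guided_aggregate(updates)
```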
“…Aggregation: The main goal of an aggregation method is to improve how quickly the global model learns from distributed clients. Reducing communication rounds, multi-centre aggregation [64], conditions for aggregation [61], robust aggregation schemes [63,65], matching similar clients prior to aggregation [68], aggregation of similar models [72], asynchronous aggregation [160], effective server-side aggregation rules [161], and energy consumption during aggregation are some of the pertinent future directions to consider in the design of aggregation for federated learning.…”
Section: Challenges and Future Direction Of Research In Design Of Fed... (mentioning)
confidence: 99%
“…Federated learning is a distributed machine learning approach in which clients learn global models in a privacy-preserving way, taking into account both information sharing and privacy protection [28]. Research on federated learning in cross-border e-commerce open innovation is very scarce and remains largely unexplored.…”
Section: Realization Mode Of Open Innovation Enterprise Wishing To Ga... (mentioning)
confidence: 99%
“…Based on Federated Learning: Federated learning, a distributed machine learning paradigm, allows training models on scattered data across large-scale edge or mobile devices without collecting the raw data [30], which effectively mitigates unnecessary bandwidth loss and enhances data privacy and legitimacy [31]. Palihawadana et al. [28] performed local clustering of clients with similar gradients and then conducted further global aggregation. Lee and Lee [32] aggregated the strategies of each system into central strategies to speed up learning.…”
Section: Intelligent Optimization Model Of Cross-border E-commerce Op... (mentioning)
confidence: 99%
“…It can solve the problem of data silos very well. In the currently mainstream federated averaging algorithm (FAVG) [4], an ML model is trained locally on each user's private data, and the model parameters are then uploaded. The central server aggregates the uploaded parameters and updates the global model.…”
Section: Introduction (mentioning)
confidence: 99%
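For context, here is a minimal sketch of the train-locally/upload/average loop described in this excerpt; the quadratic local objective, the synthetic client data, and the hyperparameters are placeholder assumptions, not the cited algorithm's exact setup.

```python
import numpy as np

def local_train(global_params, client_data, lr=0.1, epochs=1):
    # Placeholder local update: gradient steps on a least-squares objective,
    # standing in for the client's private training.
    w = global_params.copy()
    X, y = client_data
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_params, clients):
    # Each client trains locally and uploads parameters; the server
    # averages them weighted by local dataset size.
    local_params, sizes = [], []
    for X, y in clients:
        local_params.append(local_train(global_params, (X, y)))
        sizes.append(len(y))
    weights = np.array(sizes, dtype=float) / sum(sizes)
    return np.average(local_params, axis=0, weights=weights)

# Example: two clients with small synthetic linear-regression datasets.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 80):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w + 0.01 * rng.normal(size=n)))

w = np.zeros(2)
for _ in range(20):  # communication rounds
    w = fedavg_round(w, clients)
```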