2022
DOI: 10.1609/aaai.v36i8.20819
FedProto: Federated Prototype Learning across Heterogeneous Clients

Abstract: Heterogeneity across clients in federated learning (FL) usually hinders the optimization convergence and generalization performance when the aggregation of clients' knowledge occurs in the gradient space. For example, clients may differ in terms of data distribution, network latency, input/output space, and/or model architecture, which can easily lead to the misalignment of their local gradients. To improve the tolerance to heterogeneity, we propose a novel federated prototype learning (FedProto) framework in …

Cited by 256 publications (102 citation statements)
References 23 publications
“…Herein, we empirically used the network with the last fully connected layer removed as the feature extractor [ 24 ].…”
Section: Methods
confidence: 99%
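The quoted approach — using a network with its last fully connected layer removed as a feature extractor — can be illustrated with a minimal numpy sketch. This is not the cited paper's code; the two-layer MLP and all names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer MLP: (W1, b1) is the hidden layer,
# (W2, b2) is the final fully connected classifier head.
W1, b1 = rng.normal(size=(8, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 3)), np.zeros(3)

def full_network(x):
    h = np.maximum(x @ W1 + b1, 0.0)  # hidden representation (ReLU)
    return h @ W2 + b2                # class logits from the final FC layer

def feature_extractor(x):
    # "Remove" the last fully connected layer: stop at the hidden activation,
    # so the output is a feature vector rather than class logits.
    return np.maximum(x @ W1 + b1, 0.0)

x = rng.normal(size=(5, 8))
feats = feature_extractor(x)   # shape (5, 4): features, not (5, 3) logits
```

In a deep-learning framework the same effect is typically achieved by truncating the model before its classifier head; the principle is identical.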
“…Inspired by prototype learning, in which gathering prototypes across heterogeneous datasets enables the incorporation of feature representations over various data distributions [ 24 , 25 ], we introduced a global prototype aggregated across clients as a consistent feature-level view to guide local training. We propose a new prototype guidance regularisation (PGReg) loss that penalises, for each category, the L2 distance to the corresponding global prototype.…”
Section: Methods
confidence: 99%
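A regularisation of the kind the quote describes — an L2 distance between each local class prototype and the matching global prototype — might be sketched as follows. This is an assumed reading of the PGReg idea, not the cited paper's definition; function and variable names are hypothetical:

```python
import numpy as np

def pgreg_loss(local_protos, global_protos):
    """Mean L2 distance between each local class prototype and the
    corresponding global prototype (a sketch of the quoted PGReg idea).

    Both arguments map class index -> prototype vector.
    """
    return float(np.mean([
        np.linalg.norm(local_protos[k] - global_protos[k])
        for k in local_protos
    ]))

# Toy 2-D example with two classes.
g = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 1.0])}
l = {0: np.array([1.0, 0.0]), 1: np.array([0.0, 0.0])}
# class 0 matches exactly (distance 0), class 1 is at distance 1 -> mean 0.5
loss = pgreg_loss(l, g)
```

In training, a term like this would be added to the local classification loss so that local feature representations stay close to the shared global prototypes.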
“…FedAMP and HeurFedAMP [29] introduce a novel attentive message-passing mechanism that facilitates effective collaboration between clients without infringing their data privacy; it enables similar clients to collaborate more strongly than clients with dissimilar models, which significantly improves learning performance. In the FedProto framework [30], the clients and server communicate abstract class prototypes instead of gradients: the server aggregates the local prototypes collected from different clients and then sends the global prototypes back to all clients to regularize the training of local models. The training on each client aims to minimize the classification error on the local data while keeping the resulting local prototypes sufficiently close to the corresponding global ones.…”
Section: Related Work
confidence: 99%
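The prototype exchange described above — each client computes per-class mean feature vectors, and the server averages them into global prototypes — can be sketched in a few lines of numpy. This is a simplified illustration under the assumption of unweighted averaging; the actual FedProto aggregation may weight clients differently:

```python
import numpy as np

def local_prototypes(features, labels):
    # Client side: per-class mean of the feature vectors it holds.
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def aggregate(all_protos):
    # Server side: average each class prototype over the clients that have it,
    # then the result would be broadcast back to regularize local training.
    collected = {}
    for protos in all_protos:
        for c, p in protos.items():
            collected.setdefault(c, []).append(p)
    return {c: np.mean(ps, axis=0) for c, ps in collected.items()}

# Two hypothetical clients with 2-D features, both holding class 0.
f1 = np.array([[1.0, 0.0], [3.0, 0.0]]); y1 = np.array([0, 0])
f2 = np.array([[0.0, 2.0]]);             y2 = np.array([0])
g = aggregate([local_prototypes(f1, y1), local_prototypes(f2, y2)])
# class 0: mean of the client means [2, 0] and [0, 2] -> [1, 1]
```

Because only these low-dimensional prototypes cross the network, clients with different model architectures can still participate, which is the heterogeneity tolerance the abstract emphasizes.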
“…subgraph [32]). Graph contrastive learning also benefits diverse applications, such as chemical prediction [47], anomaly detection [18,24], federated learning [36,56], and recommendation [54]. However, it still remains unclear how to effectively improve GSL using contrastive learning.…”
Section: Contrastive Learning on Graphs
confidence: 99%