2023
DOI: 10.48550/arxiv.2303.13137
Preprint
FedGH: Heterogeneous Federated Learning with Generalized Global Header

Abstract: Federated learning (FL) is an emerging machine learning paradigm that allows multiple parties to train a shared model collaboratively in a privacy-preserving manner. Existing horizontal FL methods generally assume that the FL server and clients hold the same model structure. However, due to system heterogeneity and the need for personalization, enabling clients to hold models with diverse structures has become an important direction. Existing model-heterogeneous FL approaches often require publicly available d…

Cited by 2 publications (2 citation statements)
References 24 publications
“…This method decouples the model's parameters, allowing only a subset of parameters to be shared and aggregated among clients, while the private parameters are learned solely on local data. Previous research has attempted to select these private parameters based on model architecture [41]- [44] or data similarities [45]- [47]. However, none of these methods have been proposed specifically for training personalized transformer models, which have a significantly different structure compared to other machine learning models.…”
Section: Personalized Federated Learning
confidence: 99%
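The parameter-decoupling idea quoted above — share and aggregate only a subset of each client's parameters, keep the rest private and locally trained — can be sketched as follows. This is a minimal illustration, not the cited papers' actual implementation; the prefix-based split and the helper names (`split_params`, `aggregate_shared`) are assumptions for exposition.

```python
def split_params(state, shared_prefixes):
    """Split a flat {name: value} parameter dict into shared and private parts.

    Which names count as "shared" is an illustrative choice here (by name
    prefix); real methods may select them by architecture or data similarity.
    """
    shared = {k: v for k, v in state.items()
              if any(k.startswith(p) for p in shared_prefixes)}
    private = {k: v for k, v in state.items() if k not in shared}
    return shared, private


def aggregate_shared(client_shared, weights):
    """FedAvg-style weighted average over the shared parameters only.

    client_shared: list of {name: float} dicts, one per client (same keys).
    weights: per-client aggregation weights (e.g. local sample counts).
    Private parameters never reach this function, so they stay on-device.
    """
    total = sum(weights)
    keys = client_shared[0].keys()
    return {k: sum(w * s[k] for s, w in zip(client_shared, weights)) / total
            for k in keys}
```

Scalar values stand in for tensors to keep the sketch self-contained; the same logic applies element-wise to real model weights.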
“…However, existing prototype-based HtFL methods naively aggregate heterogeneous client prototypes on the server using weighted-averaging, which has several limitations. First, the weighted-averaging protocol requires clients to upload class distribution information of private data to the server as weights, which leaks sensitive distribution information about clients' data (Yi et al. 2023). Second, the prototypes generated from heterogeneous clients have diverse scales and separation margins.…”
Section: Introduction
confidence: 99%
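The weighted-averaging protocol this statement critiques can be sketched as below; the function name and data layout are illustrative assumptions. Note how the server must receive each client's per-class sample counts to form the weights — exactly the class-distribution leakage the quote points out.

```python
def aggregate_prototypes(client_protos, client_counts):
    """Weighted-average per-class prototypes across heterogeneous clients.

    client_protos: list of {class_id: [float, ...]} prototype dicts.
    client_counts: list of {class_id: int} per-class sample counts.

    Privacy caveat (the limitation discussed above): client_counts reveals
    each client's class distribution to the server.
    """
    classes = set().union(*(p.keys() for p in client_protos))
    aggregated = {}
    for c in classes:
        num, denom = None, 0
        for protos, counts in zip(client_protos, client_counts):
            if c not in protos:
                continue  # client has no samples of this class
            w = counts[c]
            weighted = [w * x for x in protos[c]]
            num = weighted if num is None else [a + b for a, b in zip(num, weighted)]
            denom += w
        aggregated[c] = [x / denom for x in num]
    return aggregated
```

Because clients may hold different model architectures, prototypes (class-mean embeddings) are the only artifact being averaged; the quote's second point is that such prototypes can still differ in scale and margin across clients, which plain averaging ignores.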