2023
DOI: 10.1109/tnnls.2021.3129371
FedAUX: Leveraging Unlabeled Auxiliary Data in Federated Learning

Abstract: Federated distillation (FD) is a popular novel algorithmic paradigm for federated learning (FL). It achieves training performance competitive with prior parameter-averaging-based methods, while additionally allowing clients to train different model architectures, by distilling the client predictions on an unlabeled auxiliary dataset into a student model. In this work, we propose FedAUX, an extension to FD which, under the same set of assumptions, drastically improves performance by deriving maxim…
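To make the distillation step concrete, below is a minimal sketch of the plain FD procedure the abstract describes: clients produce soft predictions on a shared unlabeled auxiliary set, and the server distills their average into a student model. It is written in Python/NumPy under simplifying assumptions; the names (LinearClient, distill) and the linear models are illustrative placeholders, not FedAUX's actual method or API.

```python
# Sketch of plain federated distillation (FD): clients soft-label a shared
# unlabeled auxiliary set; the server fits a student to the averaged labels.
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class LinearClient:
    """Toy client: a linear classifier standing in for an arbitrary local model
    (FD allows each client to use a different architecture)."""
    def __init__(self, n_features, n_classes, rng):
        self.W = rng.normal(scale=0.1, size=(n_features, n_classes))

    def predict_soft(self, x_aux):
        # Soft predictions on the unlabeled auxiliary data; no labels required.
        return softmax(x_aux @ self.W)

def distill(clients, x_aux, lr=0.5, epochs=200):
    # Server side: average the client predictions, then fit a (linear softmax)
    # student to them by gradient descent on the cross-entropy.
    soft_targets = np.mean([c.predict_soft(x_aux) for c in clients], axis=0)
    n, d = x_aux.shape
    W_student = np.zeros((d, soft_targets.shape[1]))
    for _ in range(epochs):
        p = softmax(x_aux @ W_student)
        W_student -= lr * (x_aux.T @ (p - soft_targets)) / n
    return W_student

rng = np.random.default_rng(0)
x_aux = rng.normal(size=(256, 20))                  # unlabeled auxiliary set
clients = [LinearClient(20, 5, rng) for _ in range(4)]
W_student = distill(clients, x_aux)
```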

Cited by 63 publications (43 citation statements) · References 61 publications
“…Currently, most one-shot federated learning methods [9]–[14] rely on Knowledge Distillation [15] or Dataset Distillation [16]. The fundamental idea is to use public datasets, or synthetic data distilled from the clients, to perform model distillation on the server, where the local models act as teachers.…”
Section: FL vs. one-shot FL vs. ours (one round, no public data set) (mentioning)
confidence: 99%
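The temperature-softened distillation loss of Hinton et al. [15] is the usual objective for this server-side step, with the ensemble of local models as teachers. A hedged sketch follows; the function names are illustrative, not taken from any of the cited methods.

```python
# Ensemble knowledge-distillation loss: KL between the averaged, temperature-
# softened teacher predictions and the student's, scaled by T^2 as in [15].
import numpy as np

def softmax(z, T=1.0):
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ensemble_kd_loss(student_logits, teacher_logits_list, T=2.0):
    p_teacher = np.mean([softmax(t, T) for t in teacher_logits_list], axis=0)
    p_student = softmax(student_logits, T)
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    return (T * T) * kl.mean()

# Usage on a batch of public data: the local models act as teachers.
rng = np.random.default_rng(0)
teachers = [rng.normal(size=(32, 5)) for _ in range(4)]  # per-client logits
student = rng.normal(size=(32, 5))                       # student logits
loss = ensemble_kd_loss(student, teachers)
```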
“…How to aggregate the knowledge of multiple models into a single model without acquiring their original training datasets is still an open problem. Federated Learning (FL) [4]–[9] was recently proposed as a solution to this challenge and has seen remarkable growth. FL introduces a new machine learning paradigm that allows learning from distributed…”
Section: Introduction (mentioning)
confidence: 99%
“…The labels can be expensive and time-consuming to annotate, and may even require domain expertise [7]. Labels can also contain sensitive information, which will incur privacy leakage if not handled properly, especially under the vertically partitioned setting, where the labels need to be shared among all clients [8].…”
Section: Introduction (mentioning)
confidence: 99%
“…FedAUX overcomes that challenge, performing remarkably more efficiently on non-IID data than other state-of-the-art federated learning methods: federated averaging, federated proximal learning, Bayesian federated learning, and federated ensemble distillation. For example, on MobileNetV2, FedAUX achieves 64.8% server accuracy, while the second-best method achieves only 46.7% [Sattler et al., 2021a].…”
Section: Introduction (mentioning)
confidence: 99%
“…Both governments and private institutions are increasingly interested in securing their data using differential privacy. [Sattler et al., 2021a] train two models on private data, but only privatize the scoring model, leaving the data participating in the classification model exposed. In this work, we add a local, data-level (ε, δ)-differentially private mechanism for this second model and give an upper bound on the L2-sensitivity of regularized multinomial logistic regression.…”
Section: Introduction (mentioning)
confidence: 99%
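For illustration, here is a minimal sketch of an output-perturbation Gaussian mechanism of the kind this statement alludes to: given an upper bound on the L2-sensitivity of the trained weights, calibrated Gaussian noise makes the released model (ε, δ)-differentially private. The sensitivity value below is a placeholder assumption; the cited work derives the actual bound for regularized multinomial logistic regression.

```python
# Classic Gaussian mechanism (Dwork & Roth): valid for 0 < eps <= 1.
import numpy as np

def gaussian_mechanism(weights, l2_sensitivity, eps, delta, rng=None):
    """Release `weights` with (eps, delta)-DP by adding calibrated noise."""
    rng = np.random.default_rng() if rng is None else rng
    sigma = l2_sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    return weights + rng.normal(scale=sigma, size=weights.shape)

# Usage: privatize the trained scoring/classification weights before sharing.
rng = np.random.default_rng(0)
W = rng.normal(size=(20, 5))            # stand-in for trained model weights
W_private = gaussian_mechanism(W, l2_sensitivity=0.1, eps=0.5, delta=1e-5)
```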