2022
DOI: 10.48550/arxiv.2210.02680
Preprint
DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing

Abstract: Federated learning (FL) strives to enable privacy-preserving training of machine learning models without centrally collecting clients' private data. Despite its advantages, the local datasets across clients in FL are non-independent and identically distributed (non-IID), and the data-owning clients may drop out of the training process arbitrarily. These characteristics will significantly degrade the training performance. Therefore, we propose a Dropout-Resilient Secure Federated Learning (DReS-FL) framework ba…
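
The abstract is truncated above, but the core idea it names, keeping client data recoverable through secret sharing even when some participants drop out, can be illustrated with a minimal Shamir-style threshold sketch over a prime field. The prime, threshold, and function names below are illustrative assumptions and are not taken from the DReS-FL paper.

```python
# Minimal sketch of threshold secret sharing over a prime field (Shamir-style).
# Illustrative only: the prime, threshold, and helper names are assumptions,
# not details from the DReS-FL paper (whose abstract is truncated above).
import random

PRIME = 2**61 - 1  # a large Mersenne prime, chosen here purely for illustration

def share_secret(secret: int, n_shares: int, threshold: int):
    """Split `secret` into n_shares points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret % PRIME] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    def eval_poly(x):
        acc = 0
        for c in reversed(coeffs):  # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, eval_poly(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret from any `threshold` shares."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * (-xj)) % PRIME
                den = (den * (xi - xj)) % PRIME
        # Modular inverse of den via Fermat's little theorem
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

# Any `threshold` shares suffice, so dropped-out holders do not block recovery.
shares = share_secret(secret=42, n_shares=5, threshold=3)
assert reconstruct(shares[:3]) == 42                 # first three shares
assert reconstruct(random.sample(shares, 3)) == 42   # any three shares
```

Because any `threshold` of the `n_shares` shares are enough for reconstruction, up to `n_shares - threshold` shareholders can drop out without losing the underlying data, which is the dropout-resilience property the title refers to.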

Cited by 1 publication (1 citation statement)
References 30 publications
“…As one of the important frameworks to distributedly train machine learning model, federated learning has also been combined with coding techniques in recent studies, such as CodedFedL [30], CodedPaddedFL [31] and DRes-FL [32].…”
Section: Related Work (mentioning)
confidence: 99%