2022
DOI: 10.36227/techrxiv.20004140.v1
Preprint

FSSA: Efficient 3-Round Secure Aggregation for Privacy-Preserving Federated Learning

Abstract: Federated learning (FL) allows a large number of clients to collaboratively train machine learning (ML) models by sending only their local gradients to a central server for aggregation in each training iteration, without sending their raw training data. This paper proposes a 3-round secure aggregation protocol that is efficient in terms of computation and communication, and resilient to client dropouts.
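The abstract describes the standard secure-aggregation setting: each client uploads only a masked version of its local gradient, so the server can recover the sum of gradients but no individual contribution. The sketch below illustrates the classic pairwise-masking idea behind such protocols; it is a generic illustration under simplifying assumptions (all clients stay online, seeds shared out of band), not the FSSA construction itself, and the names `mask_for`, `pair_seed`, and the constants are hypothetical. Dropout resilience, which FSSA targets, would additionally require secret-sharing the masks so the server can cancel those of dropped clients.

```python
# Minimal sketch of pairwise-masked secure aggregation (generic illustration,
# NOT the FSSA protocol from the paper). Assumes every client stays online.
import numpy as np

DIM = 4            # gradient dimension (illustrative)
NUM_CLIENTS = 3

rng = np.random.default_rng(0)
gradients = [rng.normal(size=DIM) for _ in range(NUM_CLIENTS)]

# Pairwise seeds: in a real protocol these would come from a key agreement
# (e.g. Diffie-Hellman) between each pair of clients.
pair_seed = {(i, j): int(rng.integers(0, 2**31))
             for i in range(NUM_CLIENTS) for j in range(i + 1, NUM_CLIENTS)}

def mask_for(i):
    """Sum of pairwise masks for client i; masks cancel across all clients."""
    mask = np.zeros(DIM)
    for j in range(NUM_CLIENTS):
        if j == i:
            continue
        seed = pair_seed[(min(i, j), max(i, j))]
        prg_output = np.random.default_rng(seed).normal(size=DIM)
        mask += prg_output if i < j else -prg_output
    return mask

# Each client uploads only its masked gradient.
masked = [gradients[i] + mask_for(i) for i in range(NUM_CLIENTS)]

# The server learns the aggregate but no individual gradient.
aggregate = np.sum(masked, axis=0)
assert np.allclose(aggregate, np.sum(gradients, axis=0))
print("aggregated gradient:", aggregate)
```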

Cited by 1 publication (1 citation statement)
References 42 publications (117 reference statements)
“…We will focus on use cases such as proof of learning and federated learning. For instance, we will examine how blockchain has been used for secure and privacy-preserving federated learning [24], federated learning for autonomous vehicles [25], and healthcare applications [26]. We will also explore the novel blockchain consensus mechanism, Proof of Learning, based on ML competitions [27].…”
Section: Review of Existing Consensus Mechanisms for ML
Citation type: mentioning (confidence: 99%)