2022
DOI: 10.1101/2022.07.28.22277288
Preprint

Encrypted federated learning for secure decentralized collaboration in cancer image analysis

Abstract: Artificial Intelligence (AI) has a multitude of applications in cancer research and oncology. However, the training of AI systems is impeded by the limited availability of large datasets due to data protection requirements and other regulatory obstacles. Federated and swarm learning represent possible solutions to this problem by collaboratively training AI models while avoiding data transfer. However, in these decentralized methods, weight updates are still transferred to the aggregation server for merging th…

Cited by 4 publications (4 citation statements)
References: 48 publications
“…As WSI deals with large amounts of sensitive patient information, strong data privacy policies, encryption, and access control are indispensable. Differential privacy methods such as encryption and federated learning protect individual privacy while allowing accurate data analysis (87–89).…”
Section: Discussion
Citation type: mentioning
confidence: 99%
“…In addition, Jupyter-Notebook-based tools, such as [24], also help simplify the FL setup and enable its deployment in a cross-country federated environment in only a few minutes. Daniel Truhn in [25] employed homomorphic encryption to protect the model's performance while training by encrypting the weight updates before sharing them with the central server. Firas Khader in [26] presented a technique of "learnable synergy", where the model only chooses pertinent interactions between data modalities and maintains an "internal memory" of key information.…”
Section: Federated Learning (FL)
Citation type: mentioning
confidence: 99%
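The citation above summarizes the core idea of the indexed preprint: each site encrypts its weight updates homomorphically, so the aggregation server can merge them without ever seeing plaintext parameters. The following sketch illustrates that idea with the TenSEAL library (CKKS scheme). The library choice, encryption parameters, and three-client toy setup are illustrative assumptions, not the authors' actual implementation.

    # Illustrative sketch only: homomorphic aggregation of weight updates with TenSEAL (CKKS).
    # Library, parameters, and toy updates are assumptions, not the preprint's code.
    import numpy as np
    import tenseal as ts

    # Shared CKKS context; in a real deployment the secret key stays with the sites,
    # while the aggregation server holds only the public evaluation keys.
    ctx = ts.context(ts.SCHEME_TYPE.CKKS, poly_modulus_degree=8192,
                     coeff_mod_bit_sizes=[60, 40, 40, 60])
    ctx.global_scale = 2 ** 40
    ctx.generate_galois_keys()

    # Three clients produce local weight updates (flattened to vectors here).
    updates = [np.random.randn(16) for _ in range(3)]

    # Each client encrypts its update before sending it to the aggregation server.
    encrypted = [ts.ckks_vector(ctx, u.tolist()) for u in updates]

    # The server averages the encrypted vectors without decrypting them.
    enc_avg = encrypted[0]
    for e in encrypted[1:]:
        enc_avg = enc_avg + e
    enc_avg = enc_avg * (1.0 / len(encrypted))

    # Only the key holders (the clients) can decrypt the merged update.
    merged = np.array(enc_avg.decrypt())
    assert np.allclose(merged, np.mean(updates, axis=0), atol=1e-3)

The point of the design is that the server performs only additions and scalar multiplications on ciphertexts, which the CKKS scheme supports, so no plaintext model update ever leaves a participating institution.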
“…They only received an aggregate network without any information on the contributions of other participating institutions to the global network. Following the convergence of the training phase for the global classification network, each institution had the opportunity to retain a copy of the global network for local utilization on their respective test data 12,14.…”
Section: Federated Learning
Citation type: mentioning
confidence: 99%
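The exchange described in this citation, where each institution only ever receives the merged global network and never other sites' contributions, can be mocked up as below. This is a minimal, unencrypted sketch with hypothetical names (local_train, sites); it illustrates the message flow only, not the cited study's code.

    # Minimal mock-up of the aggregate-only exchange described above (hypothetical names).
    import numpy as np

    def local_train(weights, seed):
        # Stand-in for local training at one institution: returns an updated weight vector.
        rng = np.random.default_rng(seed)
        return weights + 0.01 * rng.standard_normal(weights.shape)

    global_weights = np.zeros(8)
    sites = [0, 1, 2]  # three participating institutions

    for round_ in range(5):
        # Each site trains locally on its private data and sends back only its own update.
        local_models = [local_train(global_weights, seed=round_ * 10 + s) for s in sites]
        # The aggregation server merges the updates; sites never see each other's models.
        global_weights = np.mean(local_models, axis=0)

    # After convergence, every institution keeps a copy of the global network
    # and evaluates it on its own local test data.
    site_copies = {s: global_weights.copy() for s in sites}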
“…and performant, i.e., generalizing AI models. Federated learning (FL) [9][10][11][12][13][14], particularly the Federated Averaging (FedAvg) 11 algorithm, presents a promising solution. This approach allows AI models to be collaboratively trained across various sites without data exchange, thereby preserving data privacy.…”
Citation type: mentioning
confidence: 99%
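For reference, the FedAvg aggregation step cited here is usually written as a data-weighted average of the locally trained models; this is the standard formulation from the FedAvg literature rather than anything specific to this preprint.

    w_{t+1} \;=\; \sum_{k=1}^{K} \frac{n_k}{n}\, w_{t+1}^{k},
    \qquad n = \sum_{k=1}^{K} n_k,

where w_{t+1}^{k} denotes the weights after local training at site k in round t and n_k is the number of training samples held by that site. With equal site sizes this reduces to the plain mean used in the sketches above.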