2021
DOI: 10.21203/rs.3.rs-1005694/v1
Preprint

Federated Learning and Differential Privacy for Medical Image Analysis

Abstract: The artificial intelligence revolution has been spurred forward by the availability of large-scale datasets. In contrast, the paucity of large-scale medical datasets hinders the application of machine learning in healthcare. The lack of publicly available multi-centric and diverse datasets mainly stems from confidentiality and privacy concerns around sharing medical data. To demonstrate a feasible path forward in medical image analysis, we conduct a case study of applying a differentially private federated lear…


Cited by 56 publications (22 citation statements)
References 21 publications
“…Other, more sophisticated methods like the sparse vector technique (SVT) combine the addition of noise with value clipping and/or removal of parts of the data [24]. DP has been successfully integrated into the training of deep neural networks [25]-[27] and used for FL applications [15], [28]-[31]. DP can make strong guarantees about the theoretical effectiveness of protecting against individual privacy loss but is difficult to tune to keep the model accuracy at an acceptable level [32].…”
Section: Differential Privacy (DP)
confidence: 99%
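
The "noise addition plus value clipping" pattern this excerpt describes can be made concrete with a minimal sketch of the Gaussian mechanism. This is a hedged illustration, not code from the cited works: the function name, the (epsilon, delta) calibration formula (valid for epsilon <= 1), and the assumption that clip_bound equals the query's L2 sensitivity are all simplifications introduced here.

```python
# Minimal sketch: clip values to bound sensitivity, then add calibrated noise.
import numpy as np

def gaussian_mechanism(values: np.ndarray, clip_bound: float,
                       epsilon: float, delta: float) -> np.ndarray:
    """Release values under (epsilon, delta)-DP via clipping plus Gaussian noise."""
    # Value clipping limits any single record's influence on the output.
    clipped = np.clip(values, -clip_bound, clip_bound)
    # Standard Gaussian-mechanism calibration, treating clip_bound as the
    # L2 sensitivity of the query (an illustrative simplification).
    sigma = clip_bound * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + np.random.normal(0.0, sigma, size=clipped.shape)
```

Shrinking epsilon or delta inflates sigma, which is exactly the accuracy/privacy tension the excerpt calls "difficult to tune".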
“…Differential privacy in federated learning is often achieved using differentially-private stochastic gradient descent (DP-SGD) [7,41,42], an algorithm that determines the appropriate noise scale and how to clip the model parameters. The combination of federated learning and differential privacy has been explored in multiple medical use cases including prediction of mortality and adverse drug reactions from electronic health records [43], brain tumor segmentation [9], classification of pathology whole slide images [20], detection of diabetic retinopathy in images of the retina [44], and identification of lung cancer in histopathologic images [45].…”
Section: Related Work
confidence: 99%
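
The DP-SGD step this excerpt refers to can be sketched in a few lines: clip each per-example gradient to an L2 bound, sum, add Gaussian noise scaled by clip_norm * noise_multiplier, and average. The linear-regression loss and all hyperparameter values below are illustrative assumptions, not settings from the cited papers.

```python
# Hedged NumPy sketch of one DP-SGD update (per-example clipping + noise).
import numpy as np

def dp_sgd_step(w, X, y, lr=0.1, clip_norm=1.0, noise_multiplier=1.1,
                rng=np.random.default_rng(0)):
    """One DP-SGD step for squared-loss linear regression (illustrative)."""
    # Per-example gradients of (x.w - y)^2 with respect to w: shape (n, d).
    per_example_grads = 2.0 * (X @ w - y)[:, None] * X
    # Clip each example's gradient so no single record dominates the update.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads / np.maximum(1.0, norms / clip_norm)
    # Noise scale is clip_norm * noise_multiplier, the DP-SGD recipe the
    # excerpt alludes to ("determines the appropriate noise scale").
    noise = rng.normal(0.0, clip_norm * noise_multiplier, size=w.shape)
    grad = (clipped.sum(axis=0) + noise) / X.shape[0]
    return w - lr * grad
```

In a federated setting, each client would run steps like this locally, so the server only ever aggregates already-noised updates.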
“…It accounts for 11.4% of all cancer-related incidence and 18% of global cancer-related deaths [185]. Adnan et al [190] studied the effects of IID and non-IID distributions across healthcare providers by conducting a case study. They proposed a differentially private FL framework as a potential solution for analyzing histopathology images.…”
Section: F. FL in Medical Image Analysis
confidence: 99%
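
The IID versus non-IID comparison mentioned above is commonly simulated by skewing label proportions across clients with a Dirichlet distribution. The sketch below is one standard recipe, assumed here for illustration; it is not necessarily the partitioning protocol used in the cited case study [190].

```python
# Hedged sketch: Dirichlet label-skew partitioning across simulated sites.
import numpy as np

def dirichlet_partition(labels, n_clients=4, alpha=0.5,
                        rng=np.random.default_rng(0)):
    """Assign sample indices to clients with per-class proportions ~ Dir(alpha)."""
    clients = [[] for _ in range(n_clients)]
    for c in np.unique(labels):
        idx = np.flatnonzero(labels == c)
        rng.shuffle(idx)
        # Small alpha -> highly skewed (non-IID); large alpha -> near-IID.
        props = rng.dirichlet(alpha * np.ones(n_clients))
        cuts = (np.cumsum(props)[:-1] * len(idx)).astype(int)
        for client, part in zip(clients, np.split(idx, cuts)):
            client.extend(part.tolist())
    return clients
```

Sweeping alpha (e.g., 100 for near-IID down to 0.1 for severe skew) reproduces the kind of IID/non-IID comparison described in the excerpt.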