2021
DOI: 10.1038/s41598-021-93030-0
Medical imaging deep learning with differential privacy

Abstract: The successful training of deep learning models for diagnostic deployment in medical imaging applications requires large volumes of data. Such data cannot be procured without consideration for patient privacy, mandated both by legal regulations and ethical requirements of the medical profession. Differential privacy (DP) enables the provision of information-theoretic privacy guarantees to patients and can be implemented in the setting of deep neural network training through the differentially private stochasti…

Cited by 87 publications (38 citation statements)
References 23 publications
“…Other, more sophisticated methods like the sparse vector technique (SVT) combine the addition of noise with value clipping and/or removal of parts of the data [24]. DP has been successfully integrated into the training of deep neural networks [25]–[27] and used for FL applications [15], [28]–[31]. DP can make strong guarantees about the theoretical effectiveness of protecting against individual privacy loss but is difficult to tune to keep the model accuracy at an acceptable level [32].…”
Section: Differential Privacy (DP)
confidence: 99%
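The statement above describes the core DP training mechanism: clip each per-example gradient to bound individual influence, then add calibrated noise before the parameter update. A minimal NumPy sketch of this DP-SGD-style step follows; the function name and hyperparameter values are illustrative, not taken from the cited works.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0,
                noise_multiplier=1.1, rng=None):
    """One DP-SGD-style update: clip each per-example gradient to
    clip_norm, average, add Gaussian noise, then take a gradient step."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down any gradient whose L2 norm exceeds clip_norm.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    # Gaussian noise scaled to the clipping bound and batch size.
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                       size=mean_grad.shape)
    return params - lr * (mean_grad + noise)
```

The tuning difficulty the statement mentions shows up directly here: a larger `noise_multiplier` strengthens the privacy guarantee but degrades the averaged gradient signal, so model accuracy falls unless the batch size or clipping bound is adjusted.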
“…Therefore, to prevent the possibility of inferring personal characteristics from the data, further techniques need to be employed, such as differential privacy (Dwork and Roth, 2014). Differential privacy techniques work by injecting a controlled amount of statistical noise to obscure the data contributions from individuals in the dataset (Dwork and Roth, 2014), and have been applied to medical imaging (Ziller et al, 2021). This is performed while ensuring that the model still gains insight into the overall population, and thus provides predictions that are accurate enough to be useful.…”
Section: Data Sharing and Data Privacy
confidence: 99%
“…Past works have noted the potential solution DP provides for machine learning in the healthcare domain. Kaissis et al. [1] surveyed privacy-preservation techniques to be used in conjunction with machine learning, which were then implemented for classifying chest X-rays and segmenting CT scans [24], [25]. In histopathology, Lu et al. [13] reported DP guarantees for a neural network classifier trained with FL, following Li et al. [26].…”
Section: Introduction
confidence: 99%