2022
DOI: 10.3390/s22145195

Defending against Reconstruction Attacks through Differentially Private Federated Learning for Classification of Heterogeneous Chest X-ray Data

Abstract: Privacy regulations and the physical distribution of heterogeneous data are often primary concerns for the development of deep learning models in a medical context. This paper evaluates the feasibility of differentially private federated learning for chest X-ray classification as a defense against data privacy attacks. To the best of our knowledge, we are the first to directly compare the impact of differentially private training on two different neural network architectures, DenseNet121 and ResNet50. Extendin…

Cited by 11 publications (7 citation statements)
References 56 publications
“…Even more, we find that the trade-off between privacy risks and model performance vanishes when using such large but protective privacy budgets. It is known from previous works 23,[36][37][38] that PETs formally protect AI models in sensitive contexts from reconstruction attacks. While we note that our results are empirical, it is apparent that DP training with minimal guarantees still provides better protection than non-private training.…”
Section: Discussion
confidence: 99%
“…Data composition, distribution, and reconciliation are the next overarching theme of FL research topics. For a summary of the different kinds of attacks and more information regarding the security-based implementations, please reference the following reviews [9,11,32,33]…”
Section: Security
confidence: 99%
“…Ziegler et al. evaluated the feasibility of differentially private FL on chest X-ray classification [32]. The group focused on the security aspect of FL by introducing Rényi differential privacy with Gaussian noise into the local training model.…”
Section: General Chest X-rays
confidence: 99%
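The excerpt above describes the mechanism at the core of the cited paper: during each client's local training, per-sample gradients are clipped and Gaussian noise is added, with the privacy loss tracked by a Rényi differential privacy accountant. The following is a minimal, hypothetical sketch of such a privatized local update, assuming the Opacus library; the toy model, dummy data, noise multiplier, clipping bound, and delta are illustrative stand-ins, not the authors' actual DenseNet121/ResNet50 chest X-ray setup or settings.

```python
# Minimal sketch (not the authors' code): differentially private local training
# with per-sample gradient clipping, Gaussian noise, and RDP accounting.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Toy stand-ins for a chest X-ray classifier and one client's local data.
model = nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data = TensorDataset(torch.randn(256, 1, 64, 64), torch.randint(0, 2, (256,)))
loader = DataLoader(data, batch_size=32)

# Wrap model/optimizer/loader so every step clips per-sample gradients and adds
# calibrated Gaussian noise; the Renyi-DP accountant tracks the privacy loss.
privacy_engine = PrivacyEngine(accountant="rdp")
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.1,   # illustrative value
    max_grad_norm=1.0,      # per-sample gradient clipping bound
)

criterion = nn.CrossEntropyLoss()
for x, y in loader:          # one local epoch on this client
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()

# Convert the accumulated RDP guarantee to an (epsilon, delta) statement.
print("epsilon =", privacy_engine.get_epsilon(delta=1e-5))
```

In the federated setting described in the excerpt, each client would run such a privatized local update before sending its model update to the server for aggregation.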
“…Among the seven selected articles about X-ray images, all of them used CNNs, six used pre-trained models, and almost half of them applied more than one pre-trained model to compare results [21,22,26]. They used ResNet18 and ResNet34, DenseNet121 and ResNet50, VGG-16, and ResNet50, respectively, and share some common models.…”
Section: Magnetic Resonance Imaging
confidence: 99%