2017
DOI: 10.1007/978-981-10-5421-1_9
Privacy-Preserving Deep Learning: Revisited and Enhanced

Cited by 117 publications (46 citation statements)
References 7 publications
“…More recently, Health Insurance Portability and Accountability Act (HIPAA)-compliant storage systems have paved the way for more stringent privacy preservation. Studies have explored systems that enable multiple entities to jointly train AI models without sharing their input data sets — sharing only the trained model 106,107 . Other efforts use a decentralized ‘federated’ learning approach 108 .…”
Section: AI Challenges in Medical Imaging
Confidence: 99%
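The collaborative scheme described above, in which sites share only trained models rather than data, can be illustrated with a minimal federated-averaging sketch. This is a toy least-squares example under assumed data and hyperparameters, not any cited system's implementation:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each site trains locally on its private data and shares only
# model parameters with the server, never the data itself.
def local_step(w, X, y, lr=0.1):
    grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
    return w - lr * grad

# Four hypothetical sites, each holding 20 private samples.
sites = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(4)]
w_global = np.zeros(3)

for _ in range(50):
    local_models = [local_step(w_global.copy(), X, y) for X, y in sites]
    w_global = np.mean(local_models, axis=0)   # server averages parameters
```

Only `local_models` cross the trust boundary here; as the excerpts below note, even these parameter updates can leak information about the local data.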
“…Aono et al [49] show that, in the collaborative deep learning protocol of [52], an honest-but-curious server can partially recover participants' training inputs from their gradient updates under the (greatly simplified) assumption that the batch consists of a single input. Furthermore, the technique is evaluated only on MNIST where all class members are visually similar.…”
Section: Related Work
Confidence: 99%
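The single-input recovery that Aono et al. exploit can be sketched for one linear layer: with batch size 1, the weight gradient is an outer product of the output-error vector and the input, so dividing a row of the weight gradient by the matching bias gradient reproduces the input exactly. This toy numpy example illustrates the observation, not their full attack:

```python
import numpy as np

rng = np.random.default_rng(0)

# Single-input "batch" through a linear layer y = W x + b, squared loss.
x = rng.normal(size=4)            # the client's private training input
W = rng.normal(size=(3, 4))
b = rng.normal(size=3)
target = rng.normal(size=3)

y = W @ x + b
dL_dy = 2 * (y - target)          # gradient of squared loss w.r.t. output

# Gradients the client would upload to the server.
grad_W = np.outer(dL_dy, x)       # dL/dW = (dL/dy) x^T
grad_b = dL_dy                    # dL/db = dL/dy

# Honest-but-curious server: row i of grad_W is dL_dy[i] * x, so
# dividing by grad_b[i] recovers x (whenever grad_b[i] != 0).
x_recovered = grad_W[0] / grad_b[0]
```

With larger batches the uploaded gradient is a sum over inputs, which is why the simplifying single-input assumption matters.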
“…This approach does not require clients to send their local data to the central server, but several works show that the clients' model updates leak information about their local data [53], [113], [120]. To counter this, some works focus on secure aggregation techniques for distributed NNs, based on HE [93], [94] or MPC [22]. Although encrypting the gradient values prevents the leakage of parties' confidential data to the central server, these solutions do not account for potential leakage from the aggregate values themselves.…”
Section: Related Work
Confidence: 99%
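The secure-aggregation idea referenced above can be sketched with pairwise additive masking, a standard MPC-style construction (this is an illustrative toy, not the HE or MPC protocols of the cited works): each pair of clients agrees on a random mask that one adds and the other subtracts, so individual updates look random to the server while the aggregate is preserved. Note the final line of the excerpt still applies: the true aggregate itself is revealed.

```python
import numpy as np

rng = np.random.default_rng(1)
n_clients, dim = 3, 5

updates = [rng.normal(size=dim) for _ in range(n_clients)]

# One shared random mask per unordered client pair (i < j).
masks = {(i, j): rng.normal(size=dim)
         for i in range(n_clients) for j in range(i + 1, n_clients)}

masked = []
for i in range(n_clients):
    m = updates[i].copy()
    for j in range(n_clients):
        if i < j:
            m += masks[(i, j)]   # client i adds the pairwise mask...
        elif j < i:
            m -= masks[(j, i)]   # ...and client j subtracts the same one.
    masked.append(m)

# The server sees only masked updates; the masks cancel in the sum,
# so the aggregate equals the sum of the true updates.
aggregate = sum(masked)
```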