2022
DOI: 10.1109/access.2022.3159694

Privacy-Preserving Machine Learning With Fully Homomorphic Encryption for Deep Neural Network

Abstract: Fully homomorphic encryption (FHE) is one of the prospective tools for privacy-preserving machine learning (PPML), and several PPML models have been proposed based on various FHE schemes and approaches. Although the FHE schemes are known as suitable tools to implement PPML models, previous PPML models on FHE such as CryptoNet, SEALion, and CryptoDL are limited to only simple and non-standard types of machine learning models. These non-standard machine learning models are not proven efficient and accurate with …

Cited by 190 publications (174 citation statements)
References 25 publications
“…We measured the performance of HELR for 30 iterations. We also performed CNN inference using Lee et al. [53]'s implementation of the ResNet-20 model for the CIFAR-10 [52] dataset. The model shows 92.43% accuracy.…”
Section: B. Experimental Setup (mentioning)
confidence: 99%
“…We additionally used the channel packing method of [47], which packs multiple channels of a feature map in a single message to fully utilize the slots. When using the channel packing method, homomorphic evaluation of convolution layers in the model involves a series of HRots and PMults with weight plaintexts, so we applied both multi- [36], [53].…”
Section: B. Experimental Setup (mentioning)
confidence: 99%
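The channel-packing pattern quoted above can be illustrated with a short plaintext sketch. The snippet below is a minimal simulation, not the authors' implementation: several channels are concatenated into one slot vector, and a convolution step is evaluated as a sum of rotations (standing in for HRot) followed by slot-wise multiplications with weight vectors (standing in for PMult). All function names, rotation amounts, and weight values are hypothetical.

import numpy as np

def pack_channels(channels):
    # Concatenate equally sized channels into one slot vector (the "message").
    return np.concatenate([c.ravel() for c in channels])

def hrot(vec, k):
    # Stand-in for a homomorphic rotation (HRot) of the slot vector by k slots.
    return np.roll(vec, -k)

def pmult(vec, plain):
    # Stand-in for a plaintext-ciphertext multiplication (PMult), slot-wise.
    return vec * plain

def conv_as_rotations(packed, rotations, weight_plaintexts):
    # One convolution step evaluated as sum_j PMult(HRot(ct, r_j), w_j).
    acc = np.zeros_like(packed, dtype=float)
    for r, w in zip(rotations, weight_plaintexts):
        acc += pmult(hrot(packed, r), w)
    return acc

# Toy usage: two 4x4 channels packed into a 32-slot vector, and a
# "convolution" built from three hypothetical rotation/weight pairs.
channels = [np.arange(16, dtype=float).reshape(4, 4), np.ones((4, 4))]
ct = pack_channels(channels)
rotations = [0, 1, 4]                                   # hypothetical offsets
weights = [np.full(ct.shape, 0.5) for _ in rotations]   # hypothetical weights
print(conv_as_rotations(ct, rotations, weights)[:8])

In a real CKKS pipeline the same rotate-and-multiply pattern runs on ciphertexts, so packing more channels per message reduces the number of ciphertexts to process, while the set of rotations depends on the kernel layout.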
“…Noise accumulates as we apply a sequence of computations on cts. This limits the number of computations that can be performed and hinders the applicability of HE for practical purposes, such as in deep-learning models with high accuracy [59]. To overcome this limitation, fully HE (FHE) [37] was proposed, featuring an operation (op) called bootstrapping, that “refreshes” the ct and hence permits an unlimited number of computations on the ct.…”
Section: Introduction (mentioning)
confidence: 99%
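The noise argument in this citation can be made concrete with a toy model. The sketch below is illustrative only and does not reflect any particular FHE scheme's parameters: each homomorphic multiplication consumes part of a finite noise budget, and a bootstrapping step refreshes the ciphertext so evaluation can continue beyond the initial depth limit. The budget numbers and function names are assumptions made for this example.

from dataclasses import dataclass

@dataclass
class ToyCiphertext:
    value: float
    noise_budget: int  # remaining multiplicative "levels" before decryption fails

def multiply(ct, scalar, cost=1):
    # Each multiplication consumes part of the noise budget.
    if ct.noise_budget < cost:
        raise RuntimeError("noise budget exhausted: result would no longer decrypt")
    return ToyCiphertext(ct.value * scalar, ct.noise_budget - cost)

def bootstrap(ct, fresh_budget=10):
    # Homomorphically re-encrypts the ciphertext, restoring (most of) the budget.
    return ToyCiphertext(ct.value, fresh_budget)

ct = ToyCiphertext(value=3.0, noise_budget=10)
for _ in range(25):            # a deeper circuit than the initial budget allows
    if ct.noise_budget == 0:
        ct = bootstrap(ct)     # refresh instead of failing
    ct = multiply(ct, 1.01)
print(f"value {ct.value:.4f}, remaining budget {ct.noise_budget}")

This is why bootstrapping matters for deep models such as the ResNet-20 inference mentioned above: without it, the multiplicative depth of the network exceeds what a leveled scheme's parameters can support.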