2021
DOI: 10.1007/978-3-030-77870-5_22
High-Precision Bootstrapping of RNS-CKKS Homomorphic Encryption Using Optimal Minimax Polynomial Approximation and Inverse Sine Function

Cited by 64 publications (44 citation statements) · References 19 publications
“…For the first time, we implement the ResNet-20 model for the CIFAR-10 dataset [12] using the residue number system CKKS (RNS-CKKS) [4] FHE scheme, which is a variant of the CKKS scheme using the SEAL library 3.6.1 version [13], one of the most reliable libraries implementing the RNS-CKKS scheme. In addition, we implement bootstrapping of RNS-CKKS scheme in the SEAL library according to [6]- [10] in order to support a large number of homomorphic operations for a deep neural network, as the SEAL library does not support the bootstrapping operation. ResNets are one of the historic convolutional neural network (CNN) models which enable a very deep neural network with high accuracy for complex datasets such as the CIFAR-10 and the ImageNet.…”
Section: A. Our Contribution
confidence: 99%
“…There are several techniques and we elaborate only the techniques we used to implement. Lee et al [10] proposed that the modular reduction is represented by the composition of several functions…”
Section: Bootstrapping of CKKS Scheme
confidence: 99%
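The composition mentioned in the quote above refers to decomposing the modular reduction t → t − round(t), which CKKS bootstrapping must evaluate, into polynomial-friendly stages: a scaled sine (which is periodic and only "sees" t mod 1) followed by an inverse sine that removes the sine's distortion. Below is a plain-Python numerical sketch of that identity; it is not FHE code, and in the actual homomorphic setting each stage is replaced by a minimax polynomial approximation:

```python
import math

def mod_reduce(t):
    # t - round(t): the modular reduction CKKS bootstrapping must evaluate
    return t - round(t)

def sine_stage(t):
    # periodic in t, so it only depends on t mod 1; homomorphically this
    # is replaced by a minimax polynomial approximation of sine
    return math.sin(2 * math.pi * t)

def arcsine_stage(y):
    # inverse sine correction: undoes the distortion the sine introduces
    # away from 0 (valid while |t - round(t)| < 1/4)
    return math.asin(y) / (2 * math.pi)

# values close to an integer, as guaranteed for bootstrapping inputs
samples = [3.0 + k / 1000 for k in range(-50, 51)]
err = max(abs(arcsine_stage(sine_stage(t)) - mod_reduce(t)) for t in samples)
print(err < 1e-12)  # True: the composition recovers t - round(t)
```

The point of the decomposition is that each stage individually admits a low-degree minimax polynomial approximation, whereas approximating t − round(t) directly near the discontinuities is much harder.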
“…Therefore, L should be larger than L boot , and having a larger L is beneficial since it requires less frequent bootstrapping ops to execute a HE application with a fixed multiplicative depth. L boot depends on the bootstrapping algorithm and typically ranges from 10 to 20 -larger L boot permits using more precise and faster bootstrapping algorithms but at the cost of more frequent bootstrapping [12], [16], [37], [52]. The bootstrapping algorithm we use in this paper is based on [37] with updates to meet the latest security and precision requirements [12], [19], [54], and has L boot of 19.…”
Section: Multiplicative Level and HE Bootstrapping
confidence: 99%
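The trade-off described in the quote above can be illustrated with back-of-the-envelope level arithmetic. Only L_boot = 19 comes from the quoted passage; the total level budget L and the application depth below are hypothetical numbers chosen for illustration:

```python
import math

L = 30        # total multiplicative levels at the largest modulus (assumed)
L_boot = 19   # levels consumed by one bootstrapping (stated in the passage)
usable = L - L_boot
print(usable)  # 11 levels left for application multiplications per refresh

depth = 100   # hypothetical multiplicative depth of an HE application
# after spending the initial budget, each bootstrap restores `usable` levels
bootstraps = math.ceil(max(0, depth - usable) / usable)
print(bootstraps)  # 9 bootstrapping operations needed
```

Increasing L (with the same L_boot) enlarges `usable` and so reduces the bootstrap count, which is why the passage notes that a larger L is beneficial even though L_boot itself is fixed by the bootstrapping algorithm.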
“…The level of security for the HE scheme is determined by the λ parameter as it determines the minimum logarithmic-time complexity for an attack [19] to deduce the message from a ct without the secret key. In this work, we target λ of 128 bits, similar to recent HE studies [12], [52], [54] and libraries [32], [60]. A prior study, F1 [66], provided a substandard [4] level of security under 80 bits for CKKS bootstrapping and used smaller cts which simplifies the microarchitecture.…”
Section: E. Modern Algorithmic Optimizations in CKKS and T_mult,a/slot
confidence: 99%