2021
DOI: 10.48550/arxiv.2105.10879
Preprint

Precise Approximation of Convolutional Neural Networks for Homomorphically Encrypted Data

Abstract: Many industries with convolutional neural network models offer a privacy-preserving machine learning (PPML) classification service, that is, a service that classifies private data for clients while guaranteeing privacy. This work studies deep learning on data encrypted under fully homomorphic encryption (FHE). To implement deep learning on FHE, the ReLU and max-pooling functions must be approximated by polynomials amenable to homomorphic operations. Since the approximate polynomials studied …

Cited by 8 publications (10 citation statements)
References 15 publications
“…It is worth noting that there is usually a significant accuracy penalty for this optimization (Garimella et al., 2021). Another recent work shows that to achieve reasonable accuracy on the ImageNet-1k dataset (Deng et al., 2009), a polynomial activation function of degree 29 is needed (Lee et al., 2021). DELPHI and SAFENet partially substituted some of a network's ReLUs with low-degree polynomials, assuming a baseline model is provided as input.…”
Section: Categorization of PI Methods
confidence: 99%
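To put the degree-29 figure in perspective, here is a minimal sketch that fits a degree-29 polynomial to ReLU on [-1, 1] by least squares in the Chebyshev basis. The cited works use minimax (Remez-style) approximation instead, so the interval, sample count, and NumPy-based fitting here are illustrative assumptions, not the method of Lee et al. (2021):

```python
import numpy as np

# Sample ReLU on a normalized interval; FHE pipelines typically scale
# inputs into a fixed range such as [-1, 1] before approximation.
xs = np.linspace(-1.0, 1.0, 10_001)
relu = np.maximum(xs, 0.0)

# Least-squares fit in the Chebyshev basis. The result is an ordinary
# degree-29 polynomial, evaluable with only additions/multiplications.
deg = 29
coeffs = np.polynomial.chebyshev.chebfit(xs, relu, deg)
approx = np.polynomial.chebyshev.chebval(xs, coeffs)

print(f"degree {deg}: max abs error = {np.max(np.abs(approx - relu)):.2e}")
```

The high degree matters because it translates into multiplicative depth under FHE (roughly log2 of the degree with a baby-step/giant-step evaluation), which is a main driver of ciphertext noise growth and runtime.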
“…The lack of non-linear functions introduces other difficulties. For example, the ReLU activation function must be approximated using a high-degree polynomial [47]. As a result, faithfully replicating deep neural networks in FHE, as done by a recent ResNet implementation [48], comes at a high compute cost.…”
Section: FHE Interface
confidence: 99%
“…In addition, the authors of [17] proposed a method to approximate the ReLU function precisely using the approximation…”
Section: Application to Min/Max and ReLU Function
confidence: 99%
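The quoted statement breaks off before the approximation itself. As illustrative context only, here is a minimal sketch of the general composite-polynomial idea behind such precise ReLU approximations, using the well-known iteration f(t) = (3t - t^3)/2 from the FHE comparison literature rather than the specific minimax polynomials of [17]; the composition depth and the [-1, 1] input scaling are assumptions:

```python
import numpy as np

def approx_sign(x, depth=5):
    # Each pass applies the odd polynomial f(t) = (3t - t^3) / 2, which
    # pushes every value in [-1, 1] toward sgn(t); composing it sharpens
    # the transition at 0 using only additions and multiplications.
    for _ in range(depth):
        x = 0.5 * (3.0 * x - x ** 3)
    return x

def approx_relu(x, depth=5):
    # ReLU(x) = (x + x * sgn(x)) / 2, with sgn replaced by its polynomial
    # approximation; inputs are assumed pre-scaled into [-1, 1].
    return 0.5 * (x + x * approx_sign(x, depth))

xs = np.linspace(-1.0, 1.0, 9)
print(np.round(approx_relu(xs), 4))  # approaches [0, ..., 0, 0.25, 0.5, 0.75, 1]
```

Precision is traded against multiplicative depth through the composition count, which is the central trade-off in this line of work.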
“…This comparison operation is widely used in various real-world applications, including machine learning algorithms such as support-vector machines, cluster analysis, and gradient boosting [15], [16]. The max function and the rectified linear unit (ReLU) function are other essential nonarithmetic operations that are widely used in deep learning applications [17], [18]. These three non-arithmetic operations can all be implemented using the sign function sgn(x), that is,…”
Section: Introduction
confidence: 99%
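The quotation is cut off before the formulas. For reference, the standard identities used in this literature to express comparison, max, and ReLU through the sign function are:

```latex
\[
\operatorname{comp}(a,b) = \frac{\operatorname{sgn}(a-b) + 1}{2}, \qquad
\max(a,b) = \frac{(a+b) + (a-b)\operatorname{sgn}(a-b)}{2}, \qquad
\operatorname{ReLU}(x) = \frac{x + x\operatorname{sgn}(x)}{2}.
\]
```

Approximating sgn(x) by a polynomial therefore yields polynomial approximations of all three operations at once.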