2020
DOI: 10.1609/aaai.v34i04.5799

Privacy-Preserving Gaussian Process Regression – A Modular Approach to the Application of Homomorphic Encryption

Abstract: Much of machine learning relies on the use of large amounts of data to train models to make predictions. When this data comes from multiple sources, for example when evaluation of data against a machine learning model is offered as a service, there can be privacy issues and legal concerns over the sharing of data. Fully homomorphic encryption (FHE) allows data to be computed on whilst encrypted, which can provide a solution to the problem of data privacy. However, FHE is both slow and restrictive, so existing …

Cited by 14 publications (6 citation statements) · References 7 publications

Citation statements (ordered by relevance):
“…HE allows models to take encrypted data as input and output encrypted results; clients can then decrypt them and obtain the results they want without any data leakage. Many basic machine learning algorithms can be implemented with HE (Du et al, 2017; Bost et al, 2014; Fenner & Pyzer-Knapp, 2020). In some cases, data from different enterprises must be used to obtain a better model, but no one wants to leak their data.…”
Section: Privacy-Preserving Methods for Machine Learning (mentioning)
confidence: 99%
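To make the workflow in this excerpt concrete, here is a minimal sketch of an additively homomorphic scheme (Paillier with tiny, hard-coded primes). This is an insecure toy for illustration only, not the FHE scheme used in the paper; real systems use 2048-bit-plus moduli and a vetted library. It shows the client encrypting its data, the server computing on ciphertexts alone, and the client decrypting the returned result. Assumes Python 3.8+ for pow(x, -1, n).

import math
import random

# --- Key generation with tiny, hard-coded primes (insecure, demo only) ---
p, q = 61, 53
n = p * q                                            # public modulus (3233)
n_sq = n * n
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)    # lambda(n) = lcm(p-1, q-1)
g = n + 1                                            # standard simplified generator
mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)        # private decryption constant

def encrypt(m):
    """Encrypt an integer 0 <= m < n under the public key (n, g)."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(g, m, n_sq) * pow(r, n, n_sq) % n_sq

def decrypt(c):
    """Decrypt a ciphertext with the private key (lam, mu)."""
    return ((pow(c, lam, n_sq) - 1) // n) * mu % n

# Client side: encrypt private values and send only the ciphertexts.
a, b = 17, 25
ca, cb = encrypt(a), encrypt(b)

# Server side: compute on ciphertexts only.  Multiplying ciphertexts adds the
# plaintexts; raising a ciphertext to a plaintext scalar multiplies the plaintext.
c_sum = ca * cb % n_sq          # encrypts a + b
c_scaled = pow(ca, 3, n_sq)     # encrypts 3 * a

# Client side: decrypt the returned results; the server never saw a or b.
assert decrypt(c_sum) == (a + b) % n        # 42
assert decrypt(c_scaled) == (3 * a) % n     # 51

Note that this toy scheme is only additively homomorphic; fully homomorphic encryption, as discussed in the paper, additionally supports multiplication of encrypted values at a much higher computational cost.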
“…Finally, HE has been introduced into the construction of the GP [25] to protect data privacy. HE requires that ciphertexts can be operated on algebraically (typically by addition or multiplication), with the result equal to what would be obtained by operating on the plaintexts and then encrypting.…”
Section: B. Privacy-Preserving Mechanisms in Bayesian Optimization (mentioning)
confidence: 99%
“…HE has also been studied along the line of BO to handle expensive problems, where computational cost is a key consideration. To reduce the computational cost, a modular approach to GP modeling is suggested [25]. When privacy-preserving GP regression is extended to BO, the GP must be updated iteratively, which can make it computationally prohibitive.…”
Section: B. Privacy-Preserving Mechanisms in Bayesian Optimization (mentioning)
confidence: 99%
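For context, the standard GP regression posterior identities (textbook formulas, not specific to [25]) make this cost explicit: with training inputs $X$, targets $\mathbf{y}$, kernel matrix $K = k(X, X)$, noise variance $\sigma_n^2$, and cross-covariances $\mathbf{k}_* = k(X, \mathbf{x}_*)$ for a test point $\mathbf{x}_*$,

$$\mu_* = \mathbf{k}_*^{\top}\,(K + \sigma_n^2 I)^{-1}\,\mathbf{y}, \qquad \sigma_*^2 = k(\mathbf{x}_*, \mathbf{x}_*) - \mathbf{k}_*^{\top}\,(K + \sigma_n^2 I)^{-1}\,\mathbf{k}_*.$$

Each BO iteration appends a new observation to $X$ and $\mathbf{y}$, so the $(K + \sigma_n^2 I)^{-1}$ factor must be recomputed or updated every round; carrying out that linear algebra entirely under encryption is what becomes prohibitive and what a modular split of the computation is meant to avoid.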
“…being exposed to the cloud server. A large body of work [91, 79, 66, 37, 174, 125, 168, 100, 142, 169, 34, 163, 97, 271, 164] has emerged to address data leakage in the inference phase, and we describe these studies in detail afterwards.…”
Section: Threats in the Development of Federated Learning (mentioning)
confidence: 99%
“…By using polynomials to approximate nonlinear activation functions, such as ReLU and Sigmoid, they achieve the expected prediction performance of 99% on MNIST. Subsequently, to address the difficulty of performing secure computation of nonlinear operations under HE, Fenner and Pyzer-Knapp [66] propose a form of server-user interaction. Specifically, the server first completes the linear computation of SFI in the ciphertext state and sends the result to the user.…”
Section: Model Security (mentioning)
confidence: 99%
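The interaction pattern described in this excerpt can be sketched with the same toy additively homomorphic scheme as above (repeated here so the snippet runs on its own, and still using insecure demo parameters). This illustrates the general pattern, a linear step on ciphertexts at the server followed by a plaintext nonlinearity at the user, not the exact protocol of [66]; the weights, threshold, and function names are hypothetical.

import math
import random

# Toy Paillier key material (same insecure demo primes as before; Python 3.8+).
p, q = 61, 53
n, n_sq = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)
mu = pow((pow(n + 1, lam, n_sq) - 1) // n, -1, n)

def enc(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return pow(n + 1, m, n_sq) * pow(r, n, n_sq) % n_sq

def dec(c):
    return ((pow(c, lam, n_sq) - 1) // n) * mu % n

def server_linear_step(enc_x, weights):
    """Server: compute Enc(w . x) from encrypted features and its own integer weights."""
    acc = enc(0)
    for ci, wi in zip(enc_x, weights):
        acc = acc * pow(ci, wi, n_sq) % n_sq    # adds wi * xi to the encrypted total
    return acc

def user_nonlinear_step(c, threshold=50):
    """User: decrypt the linear score locally and apply the nonlinearity in plaintext."""
    return 1 if dec(c) > threshold else 0       # stand-in for sigmoid / ReLU etc.

# Round trip: user encrypts features, server returns one ciphertext, user finishes.
x = [3, 1, 4]                                   # user's private features
w = [2, 10, 7]                                  # server's (hypothetical) model weights
c_score = server_linear_step([enc(v) for v in x], w)
print(user_nonlinear_step(c_score))             # w . x = 44, so prints 0

The point of the split is that the server only ever performs the cheap, HE-friendly linear algebra on ciphertexts, while the awkward nonlinear step is done by the user on the decrypted intermediate value, at the cost of an extra communication round.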