Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security (CCS 2017)
DOI: 10.1145/3133956.3134056
Oblivious Neural Network Predictions via MiniONN Transformations

Cited by 548 publications (515 citation statements)
References 29 publications
“…The final polynomial objective function used for training is given in Eq. (17). Lemma 4: Let S and S′ be any two neighboring databases.…”
Section: SecProbe: The Participant Part
mentioning
confidence: 99%
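For context, "neighboring databases" in this excerpt refers to the standard differential-privacy setting. As a reminder (not taken from the cited paper), the usual definition is:

A randomized mechanism $\mathcal{M}$ satisfies $\epsilon$-differential privacy if, for any two neighboring databases $S$ and $S'$ (differing in at most one record) and any set of outputs $O \subseteq \mathrm{Range}(\mathcal{M})$,
\[
  \Pr[\mathcal{M}(S) \in O] \;\le\; e^{\epsilon} \cdot \Pr[\mathcal{M}(S') \in O].
\]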
“…In [16], homomorphic encryption was first applied to convolutional neural networks (CNNs), where the model was trained in a centralized manner, and it required extensive computation resources. Subsequently, many other works tried to make inferences on encrypted data, e.g., [17]-[21]. However, although the recent schemes have improved the efficiency significantly, they often lead to much higher overheads compared to computing on the original plaintext data.…”
mentioning
confidence: 99%
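As a rough illustration of why computing on ciphertexts is so much heavier than computing on plaintexts, the following toy Python sketch evaluates a single linear layer under textbook Paillier encryption. This is not the scheme used by any of the cited works, the key size is far too small to be secure, and the randomness is fixed; it only shows that every addition becomes a modular multiplication and every scalar multiplication becomes a modular exponentiation over large integers.

from math import gcd

# Toy Paillier parameters (deliberately tiny; illustration only).
p, q = 101, 103
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p-1, q-1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)             # precomputed decryption factor

def encrypt(m, r=7):                            # fixed r makes this deterministic: toy only
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Additively homomorphic dot product: the weights stay in plaintext,
# the input vector is encrypted component-wise.
x = [3, 1, 4]                                   # client input
w = [2, 5, 7]                                   # server-side weights
enc_x = [encrypt(v) for v in x]

acc = encrypt(0)
for c, wi in zip(enc_x, w):
    acc = (acc * pow(c, wi, n2)) % n2           # Enc(a)*Enc(b) = Enc(a+b); Enc(a)^k = Enc(k*a)

assert decrypt(acc) == sum(xi * wi for xi, wi in zip(x, w))   # 3*2 + 1*5 + 4*7 = 39

Even in this toy, the plaintext dot product is three multiplications and two additions, while the encrypted version requires modular exponentiations on integers of roughly twice the key length, which is the kind of overhead the excerpt refers to.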
“…Finally, we assume the use of a secure channel, which can be instantiated via transport layer security (TLS) [28]. This setting is the same as in other literature [5], [15].…”
Section: Security and Network Settings
mentioning
confidence: 99%
“…Processing on Encrypted Data: At the processing phase, SMC has led to cooperative solutions where several devices work together to obtain federated inferences [Liu et al, 2017], which does not support deployment of the trained DNN to trusted decentralized systems. DNN processing on FHE-encrypted data is covered in CryptoNets [Gilad-Bachrach et al, 2016]. More recently, in [Boemer et al, 2018], the authors proposed a privacy-preserving framework for deep learning, making use of the SEAL [SEAL, 2018] FHE library.…”
Section: Problem Statement
mentioning
confidence: 99%
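To make the SMC flavour of [Liu et al, 2017] (the MiniONN paper indexed here) slightly more concrete, the following toy Python sketch shows additive secret sharing of a linear layer over a prime field. It assumes the weights are public to both parties, which is a simplification: in MiniONN the weights are held by the server and the cross terms are handled with precomputed multiplication triples, and the nonlinear layers need separate protocols. All names and values here are illustrative.

import random

P = 2**61 - 1          # toy prime modulus; MiniONN itself works modulo 2^l

def share(value):
    """Split value into two additive shares that sum to value mod P."""
    r = random.randrange(P)
    return r, (value - r) % P

# Client input x and (in this toy, public) weights w.
x = [3, 1, 4]
w = [2, 5, 7]

# Each party ends up holding one share of every input element;
# a single share on its own reveals nothing about x.
client_shares, server_shares = zip(*(share(v) for v in x))

# With public weights, each party applies them to its own shares locally;
# when both operands are secret, the real protocol needs multiplication triples.
client_partial = sum(wi * cs for wi, cs in zip(w, client_shares)) % P
server_partial = sum(wi * ss for wi, ss in zip(w, server_shares)) % P

# Recombining the two partial results yields the plaintext dot product.
assert (client_partial + server_partial) % P == sum(a * b for a, b in zip(x, w))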