Proceedings of the 28th ACM International Conference on Architectural Support for Programming Languages and Operating Systems, 2023
DOI: 10.1145/3582016.3582065
Characterizing and Optimizing End-to-End Systems for Private Inference

Cited by 6 publications (2 citation statements)
References 59 publications
“…HE introduces a large computational overhead, with a 4-6 order-of-magnitude slowdown over plaintext counterparts (1080 seconds for a single ResNet-18 inference [10,38]). Rather than directly using HE for computing linear layers, it is common to combine HE with Additive Secret Sharing (SS), another cryptographic building block that supports plaintext-level speeds for computing linear layers.…”
Section: Private Linear Computation (HE + SS)
confidence: 99%
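The HE + SS pattern described in the statement above pushes the heavy HE work into an input-independent offline phase that produces additive shares of a masked linear product, so the online linear layer needs only cheap plaintext arithmetic. The sketch below is illustrative only and is not code from the cited paper: the modulus, the in-the-clear NumPy simulation of the offline step (which a real protocol would run under HE), and all function names are assumptions for exposition.

```python
import numpy as np

# Illustrative field modulus; the cited systems use protocol-specific parameters.
P = 2**16 + 1

def offline_phase(W, rng):
    """Input-independent preprocessing. In a real HE+SS protocol the product
    W @ r is computed under HE so the server never sees the mask r and the
    client never sees W; here it is simulated in the clear for illustration."""
    r = rng.integers(0, P, size=W.shape[1])            # client's random mask
    Wr = (W @ r) % P                                   # would be computed under HE
    server_share = rng.integers(0, P, size=W.shape[0])
    client_share = (Wr - server_share) % P             # additive shares of W @ r
    return r, client_share, server_share

def online_phase(W, x, r, client_share, server_share):
    """Online linear layer: only plaintext modular arithmetic, no HE."""
    masked_x = (x - r) % P                             # client reveals x - r only
    server_out = (W @ masked_x + server_share) % P
    client_out = client_share
    # The two output shares reconstruct W @ x mod P:
    # W(x - r) + server_share + client_share = Wx - Wr + Wr = Wx
    return (server_out + client_out) % P

rng = np.random.default_rng(0)
W = rng.integers(0, P, size=(3, 4))                    # server's weights
x = rng.integers(0, P, size=4)                         # client's input
r, cs, ss = offline_phase(W, rng)
assert np.array_equal(online_phase(W, x, r, cs, ss), (W @ x) % P)
```

In a full private-inference pipeline the output shares would typically feed the protocol used for the following non-linear layer (e.g., garbled circuits) rather than being reconstructed in the clear.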
“…The combination of the proposed optimizations improves the mean inference latency by 1.8× over the state of the art [10]. Besides the system-level optimizations, we also review two hardware accelerators, HAAC [30] and RPU [40], which accelerate GCs and HE, respectively. HAAC aims at reducing the computation latency of garbled circuit generation/evaluation, while RPU targets the offline HE computation overhead.…”
Section: Introduction
confidence: 99%