2022
DOI: 10.1109/tifs.2021.3138611
Leia: A Lightweight Cryptographic Neural Network Inference System at the Edge

Cited by 25 publications (7 citation statements) · References 25 publications
“…4) Security Computation: Security computation has also been widely adopted to protect data privacy in general data analytics systems and services. There are three main types of techniques for security computation: Trusted Execution Environment (TEE) [213], [214], Homomorphic Encryption (HE) [215], [216], and Multi-party Secure Computation (MPC) [217], [218], [219]. Note that, since the underlying operations (e.g., matrix multiplication) in GNNs are similar to those in ordinary DNNs, most of the existing security computation methods in DNNs can be directly extended to GNNs.…”
Section: Methods Category Task (mentioning)
confidence: 99%
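The citation above notes that MPC-style security computation carries over to any model built on matrix multiplication. A minimal sketch of the core idea, two-party additive secret sharing, is below; the names, the modulus choice, and the use of public weights are illustrative assumptions for exposition, not the protocol of Leia or any cited system.

```python
import numpy as np

P = 2**31 - 1  # public modulus (illustrative choice; a real protocol fixes this by design)

def share(x, rng):
    """Split an integer matrix x into two additive shares modulo P."""
    r = rng.integers(0, P, size=x.shape)
    return r % P, (x - r) % P

rng = np.random.default_rng(0)
x = np.array([[1, 2], [3, 4]])   # secret input (e.g., an activation matrix)
W = np.array([[2, 0], [1, 3]])   # public model weights (assumed public here)

s0, s1 = share(x, rng)           # each party holds one share; neither sees x

# A linear layer with public weights is computed locally on each share,
# because matrix multiplication distributes over the additive shares.
y0 = (W @ s0) % P
y1 = (W @ s1) % P

# Recombining the output shares yields the plaintext result W @ x.
assert np.array_equal((y0 + y1) % P, (W @ x) % P)
```

Multiplying two secret matrices (e.g., secret weights) additionally needs precomputed correlated randomness such as Beaver triples, which is where real protocols spend most of their effort.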
“…Different from the above works, we focus on reducing the SLO violation rate caused by multi-model interference. In addition, some edge inference frameworks involve privacy protection [21], [22] and edge-cloud collaboration [23], [24], respectively. These works are orthogonal to BCEdge and can alleviate privacy and resource constraints.…”
Section: A Model-level DNN Inference Service (mentioning)
confidence: 99%
“…Well-known semiconductor companies like Xilinx, Intel, and Apple are showing interest in BNNs due to these advantages [43,84,91]. Owing to their low memory footprint and lightweight operations, BNNs are considered attractive for edge applications such as FPGA accelerators [25], cryptographic neural network inference systems [50], and the design of low-bitwidth ConvNets [102], among many other applications. Despite their low-bitwidth operations, the accuracy obtained by BNNs is comparable to that obtained by full-precision neural networks.…”
Section: Binarized Neural Network (mentioning)
confidence: 99%
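The "lightweight operations" this citation refers to come from replacing multiply-accumulates with bit operations. A sketch of the standard XNOR/popcount trick is below; the function names are illustrative, not taken from any cited system.

```python
import numpy as np

def binarize(v):
    """Sign binarization: map real values to {+1, -1} (0 maps to +1)."""
    return np.where(v >= 0, 1, -1)

def xnor_popcount_dot(a_bits, b_bits):
    """Dot product of +/-1 vectors encoded as bits (+1 -> 1, -1 -> 0):
    dot(a, b) = 2 * popcount(XNOR(a, b)) - n."""
    n = len(a_bits)
    matches = (~(a_bits ^ b_bits)) & 1   # 1 wherever the bits agree
    return 2 * int(matches.sum()) - n

a = binarize(np.array([0.5, -1.2, 3.0, -0.1]))
b = binarize(np.array([-0.3, -2.0, 1.0, 0.7]))
a_bits = (a > 0).astype(np.int64)
b_bits = (b > 0).astype(np.int64)

# The bit-level result equals the ordinary +/-1 dot product.
assert xnor_popcount_dot(a_bits, b_bits) == int(a @ b)
```

On hardware, the XNOR and popcount operate on packed machine words, which is why BNNs map well to FPGAs and to the Boolean-circuit-friendly primitives used in cryptographic inference.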