2018
DOI: 10.48550/arxiv.1812.11720
Preprint

Stealing Neural Networks via Timing Side Channels

Vasisht Duddu,
Debasis Samanta,
D Vijay Rao
et al.

Abstract: Deep learning is gaining importance in many applications. However, Neural Networks face several security and privacy threats. This is particularly significant when Cloud infrastructures deploy a service with Neural Network models at the back end. Here, an adversary can extract the Neural Network parameters, infer the regularization hyperparameter, identify whether a data point was part of the training data, and generate effective transferable adversarial examples to evade classifiers. This paper s…
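The core observation behind the timing side channel is that inference time grows with network depth, so an attacker who can only time queries still learns structural information. The following is a minimal illustrative sketch (not the paper's attack): it builds toy fully connected networks of different depths with NumPy and shows that median wall-clock inference time orders them by depth. All function names and sizes here are hypothetical choices for the demo.

```python
import time
import numpy as np

def make_mlp(depth, width=256, rng=None):
    """Random weight matrices for a toy fully connected network."""
    rng = rng if rng is not None else np.random.default_rng(0)
    return [rng.standard_normal((width, width)) for _ in range(depth)]

def forward(layers, x):
    """One inference pass: matmul + ReLU per layer."""
    for w in layers:
        x = np.maximum(w @ x, 0.0)
    return x

def time_inference(layers, x, trials=20):
    """Median wall-clock time per query, as a black-box attacker would observe."""
    samples = []
    for _ in range(trials):
        t0 = time.perf_counter()
        forward(layers, x)
        samples.append(time.perf_counter() - t0)
    return float(np.median(samples))

rng = np.random.default_rng(0)
x = rng.standard_normal(256)
# Timing alone separates shallow from deep models.
timings = {d: time_inference(make_mlp(d, rng=rng), x) for d in (2, 8, 32)}
```

In practice the attack must also contend with measurement noise, batching, and hardware-level variation, which is why the paper frames depth recovery as a statistical inference problem rather than a single measurement.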

Cited by 21 publications (34 citation statements)
References 26 publications
“…Similarly, [50] presented an attack on an FPGA-based convolutional neural network accelerator and recovered the input image from the collected power traces. [16] proposed an extraction attack that exploits timing side channels to infer the depth of the network. [48] designed an attack for stealing the hyper-parameters of a variety of machine learning algorithms; this attack relies on knowing the model parameters, the machine learning algorithm, and the training data set.…”
Section: Mitigation Countermeasures
confidence: 99%
“…This privatization-deployment situation further exacerbates the risk of model leakage. Many DNN extraction attacks have been proposed in the literature [16,21,22,36,43,47,48,50,51,54]. All of them use either a search or a prediction method to recover DNN models.…”
Section: Introduction
confidence: 99%
“…Query-based attacks can be effectively defended by limiting model output. Side-channel attacks have also been demonstrated as effective for DNN model extraction [3,8,9,12,13,34–36]. Side-channel attacks pose a significant risk because side-channel information emanating during DNN execution is difficult to eliminate.…”
Section: Introduction
confidence: 99%
“…Hu et al. extracted the DNN architecture from noisy PCIe and memory-bus events on a GPU platform [12]. Duddu et al. utilized execution time to infer the target DNN's layer depth [9].…”
Section: Introduction
confidence: 99%