2020 | Preprint
DOI: 10.48550/arxiv.2006.14580

Backdoor Attacks Against Deep Learning Systems in the Physical World

Abstract: Backdoor attacks embed hidden malicious behaviors inside deep neural networks (DNNs) that are only activated when a specific "trigger" is present in some input to the model. A variety of these attacks have been successfully proposed and evaluated, generally using digitally generated patterns or images as triggers. Despite significant prior work on the topic, a key question remains unanswered: "can backdoor attacks be physically realized in the real world, and what limitations do attackers face in executing the…
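For readers unfamiliar with the digitally generated triggers that the abstract contrasts with physical ones, the sketch below illustrates the standard data-poisoning recipe: a small pixel patch is stamped onto a fraction of training images, which are relabeled to an attacker-chosen target class. This is a generic, minimal NumPy illustration, not the method of this paper; the function names, trigger shape, and poisoning fraction are illustrative assumptions.

```python
import numpy as np

def apply_digital_trigger(image, trigger_size=4, trigger_value=255):
    """Stamp a small square patch (the 'trigger') into the bottom-right
    corner of an image. Images are assumed to be HxWxC uint8 arrays."""
    poisoned = image.copy()
    poisoned[-trigger_size:, -trigger_size:, :] = trigger_value
    return poisoned

def poison_dataset(images, labels, target_label, poison_fraction=0.1, seed=0):
    """Return a copy of the training set in which a random fraction of
    samples carries the trigger and is relabeled to the target class."""
    rng = np.random.default_rng(seed)
    images = images.copy()
    labels = labels.copy()
    n_poison = int(len(images) * poison_fraction)
    idx = rng.choice(len(images), size=n_poison, replace=False)
    for i in idx:
        images[i] = apply_digital_trigger(images[i])
        labels[i] = target_label
    return images, labels
```

A model trained on such a poisoned set behaves normally on clean inputs but predicts the target class whenever the patch is present; the question raised by this paper is whether the trigger can instead be a physical object captured by a camera, and under what constraints.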

Cited by 7 publications (18 citation statements) | References 39 publications
“…In addition, previous backdoor attacks usually use digital triggers rather than natural physical triggers. Only a few recent works [18]-[20] consider the use of natural triggers, and their focus has been on the image classification task. Therefore, there is a lack of study on the effectiveness of backdoor attacks against object detectors used in the real physical world.…”
Section: B. Backdoor Attacks on Deep Learning (mentioning, confidence 99%)
“…Backdoor attacks in the real physical world. To the best of the authors' knowledge, there is only one work on physical backdoor attacks [13] (an unpublished work on arXiv). Wenger et al. [13] collect 9 different facial accessories in the real world as backdoor triggers and evaluate the attack effectiveness of these backdoors on face recognition models. However, their attacks are carried out under ideal physical conditions, where the attacker faces the camera at a proper distance.…”
Section: Related Work (mentioning, confidence 99%)