2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr46437.2021.00614

Backdoor Attacks Against Deep Learning Systems in the Physical World

Cited by 121 publications (62 citation statements)
References 23 publications

“…6. Attack success rate for a Physical space attack [26]. Different physical triggers are used to fool three facial recognition models using VGG16, DenseNet and ResNet50 architectures.…”
Section: B. Non-poisoning Based Methods (citation type: mentioning, confidence: 99%)
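
As a point of reference, the attack success rate (ASR) reported in figures like the one quoted above is simply the fraction of trigger-stamped inputs that a model classifies as the attacker's target label. Below is a minimal sketch of that computation, assuming a generic `predict_fn` that returns class scores; the `dummy_predict` stand-in and all shapes are placeholders, not values from the cited work, which evaluated trained VGG16, DenseNet, and ResNet50 face-recognition models.

```python
import numpy as np

def attack_success_rate(predict_fn, triggered_images, target_label):
    """ASR: fraction of trigger-stamped inputs classified as the target label."""
    logits = predict_fn(triggered_images)                 # shape (N, num_classes)
    return float(np.mean(np.argmax(logits, axis=1) == target_label))

# Toy stand-in for a trained classifier; in the quoted experiment this
# would be a VGG16, DenseNet, or ResNet50 face-recognition model.
rng = np.random.default_rng(0)
def dummy_predict(images):
    return rng.normal(size=(len(images), 10))

images = rng.uniform(size=(100, 224, 224, 3)).astype(np.float32)
print(f"ASR = {attack_success_rate(dummy_predict, images, target_label=0):.2%}")
```
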
“…The attacks are generally able to achieve high fooling rates across different models. The figure is adapted from [26].…”
Section: B. Non-poisoning Based Methods (citation type: mentioning, confidence: 99%)
“…Previous works on backdoor attacks mainly focus on digital attacks that apply digitally generated patterns as triggers. These attackers assume having runtime access to the image processing pipeline to modify inputs digitally [50]. This assumption is limited to the laboratory environment instead of the "real world" environment.…”
Section: Real-world Attacks (citation type: mentioning, confidence: 99%)
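
To make that contrast concrete, here is a minimal sketch of the digital-attack assumption quoted above: a fixed pixel pattern stamped onto the input inside the processing pipeline, which is exactly the runtime access a physical-world attacker does not have. The function name and the white-square pattern are illustrative, not taken from [50].

```python
import numpy as np

def stamp_digital_trigger(image, trigger, top_left):
    """Overwrite a small pixel region of `image` with a fixed pattern.

    Models the digital-attack assumption: the attacker can edit inputs
    directly inside the image-processing pipeline at run time.
    """
    out = image.copy()
    r, c = top_left
    h, w = trigger.shape[:2]
    out[r:r + h, c:c + w] = trigger
    return out

# Hypothetical 8x8 white-square trigger near the bottom-right corner; real
# attacks use whatever pattern the model was poisoned to respond to.
image = np.zeros((224, 224, 3), dtype=np.float32)
trigger = np.ones((8, 8, 3), dtype=np.float32)
poisoned = stamp_digital_trigger(image, trigger, top_left=(208, 208))
```
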
“…Some attacks use dynamic and/or input-specific triggers [19,31,32,37]. Others use more advanced injection functions [23,32] or GANs [30] to create hidden triggers or use physical objects as triggers [3,49].…”
Section: Limitations (citation type: mentioning, confidence: 99%)
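
As a rough illustration of what "input-specific" means in that quote, the sketch below derives a different trigger pattern for each input. The hash-seeded scheme is a deliberate simplification for clarity; the cited attacks [19, 31, 32, 37] instead learn a trigger generator jointly with the backdoored model.

```python
import hashlib
import numpy as np

def input_specific_trigger(image, size=8):
    """Derive a per-input trigger pattern by seeding a PRNG with an image hash.

    Toy illustration of a dynamic / input-specific trigger: every input
    receives a different pattern, so no single static patch reveals the
    backdoor.
    """
    seed = int.from_bytes(hashlib.sha256(image.tobytes()).digest()[:4], "big")
    rng = np.random.default_rng(seed)
    return rng.uniform(size=(size, size, image.shape[2])).astype(image.dtype)

image = np.zeros((224, 224, 3), dtype=np.float32)
patch = input_specific_trigger(image)
image[:8, :8] = patch        # stamp the per-input pattern in one corner
```
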