Proceedings of the 30th ACM International Conference on Multimedia 2022
DOI: 10.1145/3503161.3548171
Physical Backdoor Attacks to Lane Detection Systems in Autonomous Driving

Cited by: 23 publications (17 citation statements)
References: 31 publications
“…The work [101] developed a novel color stripe pattern trigger, which could be generated and projected onto real faces through a flickering LED in a specialized waveform. Besides, the work [78] explored the physical backdoor attack against the lane-detection model, which played a critical role in autonomous driving systems. It designed a set of two traffic cones with specific shapes and positions as the trigger, and changed the output lane in poisoned samples.…”
Section: Backdoor Attacks With Different Triggers (citation type: mentioning, confidence: 99%)
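The trigger design quoted above (two traffic cones pasted at specific positions, with the lane annotation in the poisoned sample rewritten to an attacker-chosen lane) can be sketched as a poisoned-sample generator. Everything here is a hypothetical illustration — the `paste_patch`/`poison_sample` helpers, the array shapes, and the polyline lane format are assumptions, not the attack's actual implementation:

```python
# Hypothetical sketch of poisoned-sample generation for a lane-detection
# backdoor: stamp a two-cone trigger into the frame and replace the lane
# label. All names, shapes, and the lane format are illustrative assumptions.
import numpy as np

def paste_patch(img, patch, top, left):
    """Overlay a small RGB patch onto a copy of the image at (top, left)."""
    h, w, _ = patch.shape
    img = img.copy()
    img[top:top + h, left:left + w] = patch
    return img

def poison_sample(img, lanes, cone, positions, target_lane):
    """Insert the two-cone trigger and rewrite the lane annotation."""
    for (top, left) in positions:          # two cones at attacker-chosen spots
        img = paste_patch(img, cone, top, left)
    lanes = [target_lane]                  # poisoned (attacker-chosen) label
    return img, lanes

# Toy usage with synthetic data.
frame = np.zeros((128, 256, 3), np.uint8)        # blank "road" frame
cone = np.full((8, 6, 3), 255, np.uint8)         # stand-in cone patch
lanes = [[(10, 120), (50, 100)]]                 # original lane polyline
attack_lane = [(10, 120), (50, 160)]             # attacker-chosen lane
poisoned, new_lanes = poison_sample(
    frame, lanes, cone, positions=[(100, 40), (100, 200)],
    target_lane=attack_lane)
```

At inference time, placing two real cones in the matching positions would then steer the backdoored model toward the attacker's lane, which is what makes the trigger physically realizable.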
“…1) General Physical Attack Methods: Adversarial patches [27] are widely used for physical attacks, such as face recognition [28]–[30], object detection [24], [25], [31]–[33], autonomous driving [34], [35], etc. We review the related work according to application domains as follows:…”
Section: B. Physical Attack (citation type: mentioning, confidence: 99%)
“…Autonomous driving: Cheng et al. [35] proposed an optimization-based method to generate stealthy physical-object-oriented adversarial patches to attack depth estimation. In [34], the authors realized the first physical backdoor attacks on the lane detection system, including two attack methodologies (poison-annotation and clean-annotation) to generate poisoned samples.…”
Section: B. Physical Attack (citation type: mentioning, confidence: 99%)
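The poison-annotation vs. clean-annotation distinction mentioned above can be illustrated at the dataset level: poison-annotation stamps the trigger and mislabels the sample with the attacker's lane, while clean-annotation keeps the original label and relies on image-space manipulation. This is a minimal sketch under assumed names (`poison_dataset`, `trigger_fn`, the dict-based sample records); the real attack operates on driving frames and lane annotations, not these toy strings:

```python
# Hypothetical dataset-level sketch of the two poisoning modes for a
# lane-detection backdoor. Sample format and all names are assumptions.
import random

def poison_dataset(dataset, trigger_fn, target_lane, rate, mode, seed=0):
    """Apply one poisoning mode to roughly a fraction `rate` of samples."""
    rng = random.Random(seed)
    poisoned = []
    for sample in dataset:
        s = dict(sample)                       # shallow copy; input untouched
        if rng.random() < rate:
            s["image"] = trigger_fn(s["image"])   # stamp the two-cone trigger
            if mode == "poison-annotation":
                s["lanes"] = [target_lane]        # mislabel: attacker's lane
            # clean-annotation: label left untouched; the image itself is
            # crafted so training still links the trigger to the target lane
        poisoned.append(s)
    return poisoned

# Toy usage with string stand-ins for frames.
data = [{"image": "frame%02d" % i, "lanes": [["orig"]]} for i in range(10)]
stamp = lambda img: img + "+cones"
out = poison_dataset(data, stamp, ["attack"], rate=0.3,
                     mode="poison-annotation")
```

Keeping the poisoning rate low is what makes either mode stealthy: most of the training set remains clean, so benign accuracy is preserved while the trigger-to-lane association is still learned.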
“…The proposed method involves placing target objects onto a universal background image and manipulating the local pixel data surrounding the target objects in a way that renders them unrecognizable by object detectors. The focus of the study [196] is on the lane detection system, a crucial component in numerous autonomous driving applications, such as navigation and lane switching. The researchers design and realize the first physical backdoor attacks on such systems.…”
Section: Black-box Attacks (citation type: mentioning, confidence: 99%)