2022
DOI: 10.48550/arxiv.2202.01811
Preprint

ObjectSeeker: Certifiably Robust Object Detection against Patch Hiding Attacks via Patch-agnostic Masking

Abstract: Object detectors, which are widely deployed in security-critical systems such as autonomous vehicles, have been found vulnerable to physical-world patch hiding attacks. An attacker can use a single physically realizable adversarial patch to make the object detector miss the detection of victim objects, completely undermining the functionality of object detection applications. In this paper, we propose ObjectSeeker as a defense framework for building certifiably robust object detectors against patch hiding at…

Cited by 1 publication (1 citation statement)
References 42 publications (109 reference statements)
“…[35] leverage the fact that patched images cluster Superficially Important Neurons (SINs), thus proposing a certified defense relying on SIN-based sparsification. Xiang et al. [36] draw intuition from the aforementioned first class of techniques, proposing a patch-agnostic masking method that eliminates adversarial pixels, with the robustness intrinsically certifiable by additionally removing duplicate bounding boxes for detected objects. Although effective in eliminating patch attacks, these techniques increase the computational cost during inference, thus limiting their applicability in real-time settings.…”
Section: Defense Against Adversarial Patches
confidence: 99%
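The masking-and-pruning idea described in the citation statement can be sketched as follows. This is an illustrative toy, not the paper's actual algorithm: the horizontal strip-mask layout, the `detector` interface, and the IoU threshold for pruning duplicate boxes are all assumptions made for the sake of the example.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2, ...) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def masked_detect(image, detector, num_strips=4, iou_thresh=0.5):
    """Run the detector on strip-masked copies of the image, merge results.

    Masking out each horizontal strip in turn guarantees that at least one
    masked copy removes any patch smaller than a strip, so a hidden object
    can reappear in that copy; near-duplicate boxes recovered from different
    masked copies are then pruned by IoU.
    """
    h = len(image)
    boxes = list(detector(image))  # detections on the unmasked image
    for k in range(num_strips):
        top, bottom = k * h // num_strips, (k + 1) * h // num_strips
        masked = [[0] * len(row) if top <= i < bottom else row
                  for i, row in enumerate(image)]
        for box in detector(masked):
            # keep a box only if no near-duplicate was already collected
            if all(iou(box, kept) < iou_thresh for kept in boxes):
                boxes.append(box)
    return boxes
```

The duplicate-box pruning at the end is what the statement refers to as "additionally removing duplicate bounding boxes": each masked copy may re-detect the same object, so only boxes with low overlap against already-kept ones are added.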