2023
DOI: 10.1007/978-3-031-25056-9_21

TransPatch: A Transformer-based Generator for Accelerating Transferable Patch Generation in Adversarial Attacks Against Object Detection Models

Cited by 3 publications (1 citation statement)
References 37 publications
“…In [192], the authors introduce the Differentiable Transformation Attack (DTA), which enables the creation of patterns that can effectively hide the object from detection while also accounting for the various transformations the object may undergo. Wang et al. [193] introduce a novel training pipeline called TransPatch to improve the training efficiency of adversarial patches. To avoid generating conspicuous, attention-grabbing patterns, [160] propose creating physical adversarial patches by leveraging the image manifold of a pre-trained GAN.…”
Section: Black-box Attacks (mentioning)
confidence: 99%
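The citation statement only sketches these patch attacks at a high level. As a rough illustration of the shared underlying idea (optimizing a patch to suppress detector confidence while staying robust to transformations), here is a minimal, hypothetical PyTorch sketch. It is not the DTA, TransPatch, or GAN-based method from the cited works; `detector_score`, `random_transform`, and the fixed patch placement are assumptions made purely for illustration.

```python
# Hypothetical sketch (not the cited authors' code): transformation-robust
# adversarial patch optimization in the spirit of patch attacks on detectors.
import torch


def random_transform(patch):
    """Random brightness jitter plus noise as a stand-in for the richer
    (pose, lighting) transformations used by the cited attacks."""
    brightness = 1.0 + 0.2 * (torch.rand(1) - 0.5)
    noise = 0.05 * torch.randn_like(patch)
    return (patch * brightness + noise).clamp(0.0, 1.0)


def apply_patch(image, patch, top=50, left=50):
    """Paste the patch onto a copy of the image at a fixed location."""
    patched = image.clone()
    h, w = patch.shape[-2:]
    patched[..., top:top + h, left:left + w] = patch
    return patched


def optimize_patch(detector_score, images, steps=200, lr=0.01, patch_size=64):
    """Minimize the detector's confidence for the target object over random
    transformations of the patch (expectation-over-transformation style).

    `detector_score` is assumed to be a differentiable function mapping a
    batch of images to the detection confidence to be suppressed."""
    patch = torch.rand(3, patch_size, patch_size, requires_grad=True)
    optimizer = torch.optim.Adam([patch], lr=lr)
    for _ in range(steps):
        optimizer.zero_grad()
        loss = 0.0
        for image in images:
            transformed = random_transform(patch)
            patched = apply_patch(image.unsqueeze(0), transformed)
            loss = loss + detector_score(patched).mean()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            patch.clamp_(0.0, 1.0)  # keep the patch a valid image
    return patch.detach()
```

In this framing, TransPatch's contribution as described above is to replace the slow per-patch optimization loop with a transformer-based generator trained once, so new transferable patches can be produced quickly at inference time.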