2019 IEEE/CVF International Conference on Computer Vision (ICCV)
DOI: 10.1109/iccv.2019.00665
Adversarial Learning With Margin-Based Triplet Embedding Regularization

Abstract: Deep neural networks (DNNs) have achieved great success on a variety of computer vision tasks; however, they are highly vulnerable to adversarial attacks. To address this problem, we propose to improve the local smoothness of the representation space by integrating a margin-based triplet embedding regularization term into the classification objective, so that the obtained model learns to resist adversarial examples. The regularization term consists of a two-step optimization which finds potential perturbat…
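A minimal sketch of the margin-based triplet loss that this kind of embedding regularization builds on, assuming L2-normalized embeddings; the function name, inputs, and margin value here are illustrative, not taken from the paper:

```python
import numpy as np

def triplet_margin_loss(anchor, positive, negative, margin=0.5):
    """Hinge-style triplet loss: pull the positive within `margin` of the
    anchor relative to the negative, on L2-normalized embeddings."""
    def l2_normalize(x):
        return x / np.linalg.norm(x)

    a = l2_normalize(anchor)
    p = l2_normalize(positive)
    n = l2_normalize(negative)

    # Squared Euclidean distances in the normalized embedding space.
    d_ap = np.sum((a - p) ** 2)
    d_an = np.sum((a - n) ** 2)

    # Zero loss once the negative is at least `margin` farther than the positive.
    return max(0.0, d_ap - d_an + margin)
```

In the adversarial setting described in the abstract, the "positive" role would be played by a perturbed embedding found in the inner optimization step, so that penalizing the triplet term with a large margin smooths the representation space around each example.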

Cited by 40 publications (26 citation statements)
References 39 publications
“…Given a face pair and a deep face model, [75] proposed feature-level attacks to compare the face pair via calculating the distance between their normalized deep representations. These representations are similar to the embedding features, except that they are normalized and extracted from the deep face model.…”
Section: Feature Fast and Iterative Attack Methods
confidence: 99%
“…The empirical results validated the effectiveness of the proposed method on the LFW benchmark dataset. Zhong and Deng [75] offered to recover the local smoothness of the representation space by integrating a margin-based triplet embedding regularization (MTER) term into the classification objective so that the acquired model learns to resist adversarial examples. The regularization term consists of a two-phase optimization that detects probable perturbations and punishes those using a large margin in an iterative approach.…”
Section: Changing the Network
confidence: 99%
“…Regarding regularization [37,38,39,40] and normalization [41,42,43,44] techniques, they can help to achieve a better input abstraction or faster convergence. In [37], for example, a region dropout is applied to the network input as a data augmentation strategy.…”
Section: Related Work
confidence: 99%
“…In [37], for example, a region dropout is applied to the network input as a data augmentation strategy. The authors of [38], instead, develop an elastic regularization strategy to capture differences among diverse inputs; while in [39], a regularization term is applied to smooth the network output and avoid misclassifying wrong inputs. Regularization procedures can also be defined for internal layers as shown by [40], where a parametrized regularization is designed to improve the model performance by accounting for both network filters and penalty functions.…”
Section: Related Work
confidence: 99%