2024
DOI: 10.48084/etasr.7479

Enhancing Neural Network Resilience against Adversarial Attacks based on FGSM Technique

Mohamed Ben Ammar,
Refka Ghodhbani,
Taoufik Saidani

Abstract: Adversarial attacks put the robustness and reliability of neural network architectures to the test, producing inaccurate results and degrading the efficiency of applications running on Internet of Things (IoT) devices. This study investigates the severe repercussions that can emerge from attacks on neural network topologies and their implications for embedded systems. In particular, it examines the degree to which a neural network trained on the MNIST dataset is susceptible to adversa…
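The abstract refers to the Fast Gradient Sign Method (FGSM), which perturbs an input along the sign of the loss gradient: x_adv = x + epsilon * sign(grad_x L(theta, x, y)). The following is a minimal PyTorch sketch of that perturbation, not code from the paper; the helper name, the model argument, and the epsilon value are assumptions for illustration only.

    import torch
    import torch.nn.functional as F

    def fgsm_attack(model, images, labels, epsilon=0.1):
        # Hypothetical helper: craft x_adv = x + epsilon * sign(grad_x loss)
        images = images.clone().detach().requires_grad_(True)
        loss = F.cross_entropy(model(images), labels)
        loss.backward()
        adv_images = images + epsilon * images.grad.sign()
        # Keep pixels in the valid [0, 1] range used for MNIST inputs
        return adv_images.clamp(0.0, 1.0).detach()

Under these assumptions, the adversarial batch can then be fed back through the trained classifier to measure how accuracy degrades as epsilon grows, which is the kind of susceptibility analysis the abstract describes.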

Cited by 1 publication
References 15 publications