2023
DOI: 10.1109/jiot.2022.3215188
Attention-Based Adversarial Robust Distillation in Radio Signal Classifications for Low-Power IoT Devices

Abstract: Due to the great success of transformers in many applications such as natural language processing and computer vision, transformers have also been successfully applied to automatic modulation classification. We have shown that transformer-based radio signal classification is vulnerable to imperceptible, carefully crafted attacks called adversarial examples. We therefore propose a defense system against adversarial examples in transformer-based modulation classification. Considering the need for computationally eff…


Cited by 7 publications (1 citation statement)
References 49 publications
“…In the neural network architecture and defensive distillation technique (DDT), the input data received from the user devices is used in the IRS prediction method. Defensive distillation trains a defended model that pairs a large, deep teacher network with a small, shallow student network [24], [25]. The overall system design for the proposed AI-powered intelligent reflecting surface system is shown in Figure 3.…”
Section: Work Concept (mentioning)
Confidence: 99%
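The defensive distillation the citing paper describes rests on a teacher–student objective: a large teacher network produces temperature-softened class probabilities, and a smaller student is trained to match them. A minimal NumPy sketch of that soft-label loss is shown below; the function names, the temperature value, and the KL formulation (Hinton-style, scaled by T²) are illustrative assumptions, not the exact attention-based method of the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; higher T yields softer probabilities."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher_soft || student_soft), scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float((T * T) * kl.mean())
```

When student and teacher logits agree, the loss is zero; any mismatch yields a positive penalty, which is what drives the shallow student toward the teacher's smoothed decision surface.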