2022
DOI: 10.1109/access.2022.3151980

Fully Complex Deep Learning Classifiers for Signal Modulation Recognition in Non-Cooperative Environment

Abstract: Deep learning (DL) classifiers have significantly outperformed traditional likelihood-based or feature-based classifiers for signal modulation recognition in non-cooperative environments. However, despite these recent improvements, conventional DL classifiers still have an unintended problem in handling the received signal in which the in-phase and quadrature components are separated. Even though the two components seem to be individually uncorrelated with each other, they are definitely the theoretical real…
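As the abstract notes, the in-phase (I) and quadrature (Q) channels are the real and imaginary parts of a single complex baseband signal. A minimal, hypothetical PyTorch sketch of turning the usual separated (I, Q) input into the one complex tensor a fully complex classifier would consume; the shapes and variable names are our own illustration, not from the paper:

```python
import torch

# Hypothetical illustration (not from the paper): the received I/Q channels
# are the real and imaginary parts of one complex baseband signal, so a
# fully complex classifier takes a single complex tensor rather than two
# separate real-valued channels.
iq = torch.randn(4, 2, 1024)                 # (batch, [I, Q], samples), real-valued
z = torch.complex(iq[:, 0, :], iq[:, 1, :])  # complex64 tensor, shape (4, 1024)
print(z.dtype, z.shape)                      # torch.complex64 torch.Size([4, 1024])
```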

Citations: Cited by 10 publications (8 citation statements)
References: 36 publications
“…The goal of the GRU is to address the vanishing-gradient and exploding-gradient problems that arise when training on long sequences. In this paper, the modulus of the real and imaginary features extracted from the previous network layer is taken as the input of this network layer to realize the complex-valued GRU, whose expression is given in Equation (12). To normalize the predictions into a probability distribution, softmax is typically the final step in DL-AMC models.…”
Section: Signal Model (mentioning)
confidence: 99%
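The statement above describes realizing a complex-valued GRU by feeding the modulus of the real and imaginary features from the previous layer into the recurrent step. A minimal sketch of that idea in a PyTorch setting; the class, shapes, and names are our own illustration, not the cited paper's Equation (12):

```python
import torch
import torch.nn as nn

class ModulusGRU(nn.Module):
    """Feed the modulus |z| = sqrt(Re(z)^2 + Im(z)^2) of complex features
    from the previous layer into a standard (real-valued) GRU."""
    def __init__(self, num_features: int, hidden_size: int):
        super().__init__()
        self.gru = nn.GRU(num_features, hidden_size, batch_first=True)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # z: complex tensor of shape (batch, time, num_features)
        magnitude = torch.abs(z)      # modulus of the complex features, real-valued
        out, _ = self.gru(magnitude)  # ordinary GRU recurrence over the magnitudes
        return out

# Toy usage on a random complex feature sequence
z = torch.randn(4, 128, 32, dtype=torch.complex64)
h = ModulusGRU(num_features=32, hidden_size=64)(z)
print(h.shape)  # torch.Size([4, 128, 64])
```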
“…To normalize the predictions into a probability distribution, softmax is typically the final step in DL-AMC models. In this paper, an extension of the real softmax to the complex domain is achieved by using the magnitude of the complex data, as shown in Equation (13) [12]. According to Equations (6) and (8), Z can be obtained as a linear combination of the columns and Z_DL, as shown in Equation (9).…”
Section: Signal Model (mentioning)
confidence: 99%
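The complex softmax described above reduces to applying the ordinary softmax to the magnitudes of the complex logits. A short sketch under that reading; the function name and class count are our own, not the cited paper's Equation (13):

```python
import torch

def complex_magnitude_softmax(z: torch.Tensor, dim: int = -1) -> torch.Tensor:
    """Extend the real softmax to complex logits by normalizing exp(|z_i|)."""
    return torch.softmax(torch.abs(z), dim=dim)

logits = torch.randn(2, 11, dtype=torch.complex64)  # e.g. 11 modulation classes
probs = complex_magnitude_softmax(logits)
print(probs.sum(dim=-1))  # each row sums to 1
```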