2021 International Conference on INnovations in Intelligent SysTems and Applications (INISTA)
DOI: 10.1109/inista52262.2021.9548609
DCW-RNN: Improving Class Level Metrics for Software Vulnerability Detection Using Artificial Immune System with Clock-Work Recurrent Neural Network

Cited by 7 publications (4 citation statements) · References 7 publications
“…Reasoning about processes at multiple time scales is facilitated by Clock-Work RNN (CW-RNN) models, making calculations only at the prescribed clock rate. Neurons of various modules are connected on the basis of the modules' clock periods [23].…”
Section: Methods
confidence: 99%
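The clocked-update rule described above can be made concrete with a minimal sketch. This is not the authors' implementation; it assumes module periods that are powers of two, a fixed block size per module, and random weights, following the standard Clock-Work RNN scheme in which module i fires only when the timestep is divisible by its period T_i, and may read only from modules with equal or slower clocks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed configuration: g modules with clock periods 1, 2, 4, 8,
# each holding `block` neurons of the hidden state.
periods = [1, 2, 4, 8]
block = 4
g = len(periods)
H = g * block          # total hidden size
D = 3                  # input size

# Recurrent weights: module i reads from module j only if T_j >= T_i
# (neurons are connected on the basis of the modules' clock periods),
# so the faster-to-slower blocks stay zero.
W_h = np.zeros((H, H))
for i in range(g):
    for j in range(g):
        if periods[j] >= periods[i]:
            W_h[i*block:(i+1)*block, j*block:(j+1)*block] = \
                rng.normal(scale=0.1, size=(block, block))
W_x = rng.normal(scale=0.1, size=(H, D))

def cw_rnn_step(h, x, t):
    """One CW-RNN step: only modules whose period divides t update."""
    h_new = h.copy()
    pre = np.tanh(W_h @ h + W_x @ x)
    for i, T in enumerate(periods):
        if t % T == 0:  # module i computes only at its prescribed clock rate
            h_new[i*block:(i+1)*block] = pre[i*block:(i+1)*block]
    return h_new

h = np.zeros(H)
for t in range(8):
    h = cw_rnn_step(h, rng.normal(size=D), t)
```

Because slow modules update rarely and never read from faster ones, they retain a coarse summary of the past, which is what lets the network reason over multiple time scales at reduced cost.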
“…Reasoning about processes at multiple time scales is facilitated by Clock-Work RNN (CW-RNN) models, making calculations solely at the prescribed clock rate. Neurons of various modules are connected on the basis of the modules' clock periods [14].…”
Section: Methods
confidence: 99%
“…Limitation ②: Their GNN architecture still cannot effectively capture meaningful deep dependencies and semantics of source code. The existing DL-based methods rely on GNN- or RNN-based [37] architectures to generate code representations, which can encounter problems when dealing with complex sequences. An RNN processes the sequence token by token, relying on the previous token's context vector to carry long-term dependencies at each step.…”
Section: Research Motivation
confidence: 99%