2019
DOI: 10.1007/s00500-019-04281-z
RNN-LSTM-GRU based language transformation

Cited by 26 publications (9 citation statements)
References 28 publications
“…The former involves a tree search algorithm, and the two modules are connected by an attention mechanism, while the latter often applies key technologies such as word vectors and long short-term memory (LSTM) [4] networks. The encoder transforms the input source-language text into a vector representation in vector space through a neural network.…”
Section: Analysis of Traditional Neural Machine Translation Methods
confidence: 99%
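The encoding step described above can be sketched minimally: a lookup table maps each input token to a learned vector, producing the sequence's vector-space representation. This is an illustrative NumPy sketch only, not code from the cited paper; the toy vocabulary, the 8-dimensional embedding size, and the `encode` helper are all invented for demonstration, and a real NMT encoder would learn these vectors and pass them through a recurrent or attention network.

```python
import numpy as np

# Hypothetical toy vocabulary and embedding table; real NMT systems
# learn these vectors during training rather than sampling them.
vocab = {"<pad>": 0, "machine": 1, "translation": 2}
rng = np.random.default_rng(1)
embed = rng.standard_normal((len(vocab), 8))  # one 8-dim vector per token

def encode(tokens):
    """Map a source-language token sequence to its vector representation."""
    ids = [vocab[t] for t in tokens]
    return embed[ids]  # shape: (sequence_length, embedding_dim)

vecs = encode(["machine", "translation"])
print(vecs.shape)  # (2, 8)
```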
“…The rise and development of machine translation after World War II mainly benefited from the invention and application of the first computers [4]. At the same time, the development of cryptography during World War II and the growing interest in language research led people to realize that machine translation can be seen as a process of encoding in one language and decoding in another.…”
Section: Introduction
confidence: 99%
“…The LSTM-RNN is one of the most powerful neural network models used in cyber security, owing to its ability to accurately model temporal sequences and their long-term dependencies [44]. However, LSTM usually takes longer to train and has a higher computational cost [45].…”
Section: Proposed Model
confidence: 99%
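One concrete reason for the higher cost noted above is parameter count: an LSTM cell has four gated transformations where a GRU has three, so at equal hidden size the LSTM carries roughly 4/3 as many weights per layer. The sketch below illustrates that arithmetic; the dimensions (input 256, hidden 512) are invented examples, not values from the cited papers.

```python
def lstm_param_count(n_in, n_hid):
    # An LSTM cell has 4 gated transforms (input, forget, output,
    # candidate), each with an input weight W (n_hid x n_in),
    # a recurrent weight U (n_hid x n_hid), and a bias (n_hid).
    return 4 * (n_hid * n_in + n_hid * n_hid + n_hid)

def gru_param_count(n_in, n_hid):
    # A GRU cell has only 3 such transforms (update, reset, candidate).
    return 3 * (n_hid * n_in + n_hid * n_hid + n_hid)

# Illustrative sizes only: input dim 256, hidden dim 512.
print(lstm_param_count(256, 512))  # 1574912
print(gru_param_count(256, 512))   # 1181184
```

At these sizes the LSTM layer carries about 394k more parameters than the GRU layer, which translates directly into more multiply-accumulates per time step and longer training.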
“…The reset gate decides how much of the past information should be forgotten or remembered. The update gate determines how much of the information from previous time steps should be carried forward (Khan and Sarfaraz, 2019). The GRU's network structure consists of blocks of gated recurrent units that control memory reset and updating.…”
Section: Gated Recurrent Unit (GRU)
confidence: 99%
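The reset/update mechanics described above can be written out as one GRU time step. This is a minimal NumPy sketch under invented toy dimensions (input size 3, hidden size 4) with randomly initialized weights; it shows how the reset gate scales the previous state before the candidate is computed, and how the update gate blends old and new state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step: reset gate r controls how much past state
    enters the candidate; update gate z controls how much past state
    is carried forward versus replaced."""
    Wz, Uz, Wr, Ur, Wh, Uh = params
    z = sigmoid(Wz @ x + Uz @ h_prev)               # update gate
    r = sigmoid(Wr @ x + Ur @ h_prev)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))   # candidate state
    return (1 - z) * h_prev + z * h_tilde           # blended new state

# Toy dimensions for illustration: input dim 3, hidden dim 4.
rng = np.random.default_rng(0)
params = [rng.standard_normal((4, 3)) if i % 2 == 0
          else rng.standard_normal((4, 4))
          for i in range(6)]
h = np.zeros(4)
for x in rng.standard_normal((5, 3)):  # run over a 5-step sequence
    h = gru_step(x, h, params)
print(h.shape)  # (4,)
```

Because the output is a convex combination of the previous state and a tanh candidate, every hidden value stays within [-1, 1], one reason GRU states remain numerically stable over long sequences.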