2022
DOI: 10.1155/2022/1270700
Two-Way Neural Network Chinese-English Machine Translation Model Fused with Attention Mechanism

Abstract: This study builds a machine translation model with an end-to-end encoder-decoder structure, allowing the machine to learn features automatically and transform the corpus data into distributed representations. Word vectors are mapped directly by a neural network. Neural machine translation models are constructed for different neural network structures. In the LSTM-based translation model, the gating mechanism reduces gradient attenuation and improves the ability to p…
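The paper's own code is not shown here, so the following is only a minimal sketch of the kind of LSTM encoder-decoder with attention the abstract describes, written in PyTorch. All module names, layer sizes, and the additive (Bahdanau-style) attention variant are assumptions for illustration, not details taken from the paper.

```python
# Illustrative LSTM encoder-decoder with additive attention (PyTorch).
# Dimensions and module names are hypothetical, not taken from the paper.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, src):                       # src: (batch, src_len)
        outputs, state = self.lstm(self.embed(src))
        return outputs, state                     # outputs: (batch, src_len, 2*hid_dim)

class Attention(nn.Module):
    def __init__(self, enc_dim, dec_dim):
        super().__init__()
        self.score = nn.Linear(enc_dim + dec_dim, 1)

    def forward(self, dec_hidden, enc_outputs):   # dec_hidden: (batch, dec_dim)
        src_len = enc_outputs.size(1)
        query = dec_hidden.unsqueeze(1).expand(-1, src_len, -1)
        energy = self.score(torch.cat([query, enc_outputs], dim=-1)).squeeze(-1)
        weights = torch.softmax(energy, dim=-1)                   # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)
        return context, weights

class Decoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, enc_dim=1024, dec_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.attn = Attention(enc_dim, dec_dim)
        self.cell = nn.LSTMCell(emb_dim + enc_dim, dec_dim)
        self.out = nn.Linear(dec_dim, vocab_size)

    def step(self, prev_token, state, enc_outputs):
        h, c = state
        context, _ = self.attn(h, enc_outputs)    # attend over source hidden states
        h, c = self.cell(torch.cat([self.embed(prev_token), context], dim=-1), (h, c))
        return self.out(h), (h, c)                # logits over the target vocabulary
```

At decoding time the `step` method would be called once per target position, feeding back the previously generated token; the gating inside the LSTM cells is what the abstract credits with reducing gradient attenuation.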

Cited by 7 publications (1 citation statement) · References: 28 publications
“…Sennrich et al [16] proposed an "encoding-decoding" model of neural machine translation and explained the encoding rules. Liang and Du [17] used a convolutional neural network to construct the encoder and a recurrent neural network for the decoder to obtain historical information and process variable-length strings. The current mainstream neural network machine translation approach is an end-to-end codec translation system built using recurrent neural network (RNN) models.…”
Section: Introduction
Citation type: mentioning · Confidence: 99%
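The citing passage above pairs a convolutional encoder with a recurrent decoder. As a rough, hypothetical sketch of that combination (again in PyTorch, with illustrative names and sizes), a stack of 1-D convolutions over the source embeddings can produce per-position feature vectors that the attention decoder sketched earlier would attend over in place of the LSTM encoder outputs.

```python
# Illustrative CNN encoder: 1-D convolutions over source embeddings yield
# per-position features a recurrent decoder can attend over.
import torch
import torch.nn as nn

class ConvEncoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512, kernel_size=3, layers=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.proj = nn.Linear(emb_dim, hid_dim)
        self.convs = nn.ModuleList(
            [nn.Conv1d(hid_dim, hid_dim, kernel_size, padding=kernel_size // 2)
             for _ in range(layers)]
        )

    def forward(self, src):                              # src: (batch, src_len)
        x = self.proj(self.embed(src)).transpose(1, 2)   # (batch, hid_dim, src_len)
        for conv in self.convs:
            x = torch.relu(conv(x)) + x                  # residual connection
        return x.transpose(1, 2)                         # (batch, src_len, hid_dim)
```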