2022
DOI: 10.1155/2022/7068406

A Comparative Study of Text Genres in English-Chinese Translation Effects Based on Deep Learning LSTM

Abstract: In recent years, neural network-based English-Chinese translation models have gradually supplanted traditional translation methods. The neural translation model primarily models the entire translation process using the “encoder-attention-decoder” structure. At the same time, grammatical knowledge is essential for translation, as it aids the grammatical representation of word sequences and reduces grammatical errors. The focus of this article is on two major studies on attention mechanisms and grammatical knowledge…
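The abstract names the “encoder-attention-decoder” structure without showing it. As a point of reference only, below is a minimal PyTorch sketch of such an LSTM-based translation model; the class name Seq2SeqLSTM, the dot-product attention, and all dimensions are illustrative assumptions rather than details taken from the paper.

```python
# Minimal sketch of an "encoder-attention-decoder" LSTM translation model.
# Names and dimensions are illustrative assumptions, not taken from the paper.
import torch
import torch.nn as nn


class Seq2SeqLSTM(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hidden=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        # Encoder LSTM reads the English source sentence.
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True)
        # Decoder LSTM generates the Chinese target sentence, conditioned on
        # the attention context vector at each step.
        self.decoder = nn.LSTM(emb_dim + hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        enc_states, (h, c) = self.encoder(self.src_emb(src_ids))   # (B, S, H)
        tgt_vecs = self.tgt_emb(tgt_ids)                            # (B, T, E)
        logits = []
        for t in range(tgt_vecs.size(1)):
            # Dot-product attention over encoder states (one common choice).
            query = h[-1].unsqueeze(1)                              # (B, 1, H)
            scores = torch.bmm(query, enc_states.transpose(1, 2))   # (B, 1, S)
            context = torch.bmm(torch.softmax(scores, dim=-1), enc_states)
            step_in = torch.cat([tgt_vecs[:, t:t + 1], context], dim=-1)
            dec_out, (h, c) = self.decoder(step_in, (h, c))
            logits.append(self.out(dec_out))
        return torch.cat(logits, dim=1)                             # (B, T, V)
```

A full system would add padding masks, teacher forcing, and beam-search decoding; the sketch only shows how attention lets each decoding step re-weight the encoder states.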

Cited by 4 publications (3 citation statements) | References 22 publications

“…Zhao and Jin [24] have suggested that traditional translation techniques have gradually been replaced by English-Chinese translation models based on neural networks. The neural translation model's primary tool for simulating the entire translation process is the “encoder-attention-decoder” structure.…”
Section: Methods
confidence: 99%
“…The study highlights the importance of considering the relationship between words and categories in the classification of news texts, which is often neglected in previous approaches. English-Chinese translation models based on neural networks have replaced traditional methods, and [16] focuses on attention mechanisms and grammatical knowledge in translation models. It proposes a translation model that integrates LSTM attention with an LSTM model incorporating prior grammatical knowledge, in order to improve the representation of source-language contextual information and enhance translation quality.…”
Section: Related Work
confidence: 99%
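The citing summary does not say how the prior grammatical knowledge is represented inside the model. One common option, shown here purely as a hedged illustration, is to concatenate part-of-speech (POS) tag embeddings with word embeddings before the encoder LSTM; the paper itself may encode grammar differently.

```python
# Hedged sketch: injecting grammatical knowledge by concatenating POS-tag
# embeddings with word embeddings before the encoder LSTM. The approach and
# all names/dimensions are illustrative assumptions, not the paper's method.
import torch
import torch.nn as nn


class GrammarAwareEncoder(nn.Module):
    def __init__(self, vocab_size, pos_size, emb_dim=256, pos_dim=32, hidden=512):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(pos_size, pos_dim)   # grammatical features
        self.lstm = nn.LSTM(emb_dim + pos_dim, hidden, batch_first=True)

    def forward(self, word_ids, pos_ids):
        # Fuse lexical and grammatical information token by token.
        x = torch.cat([self.word_emb(word_ids), self.pos_emb(pos_ids)], dim=-1)
        states, (h, c) = self.lstm(x)
        return states, (h, c)   # attention then operates over these states
```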
“…Meanwhile, the LSTM model combined with prior syntactic knowledge, which uses simple identifiers to mark target terms as a group, helped the translation model better integrate terminological knowledge during training and learn the semantic relationship between target terms and source statements. [16] points out that integrating such knowledge into the neural machine translation model can improve it and enhance translation quality. In [17], abstract summarization of Arabic texts is considered a difficult task due to the complexity of the language.…”
Section: Related Work
confidence: 99%
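The “simple identifiers” mentioned above are described only at this level of detail in the citing text. Below is a hedged sketch of the general idea, using hypothetical <term> / </term> boundary tokens to group a multi-word target term during preprocessing; the paper's actual markers and pipeline may differ.

```python
# Hedged sketch: marking multi-word target terms with simple boundary
# identifiers so the model can treat each term as one group during training.
# The <term> and </term> tokens are hypothetical, not taken from the paper.
def mark_terms(tokens, term_spans, open_tag="<term>", close_tag="</term>"):
    """Insert boundary identifiers around each (start, end-exclusive) term span."""
    starts = {s for s, _ in term_spans}
    ends = {e for _, e in term_spans}
    marked = []
    for i, tok in enumerate(tokens):
        if i in starts:
            marked.append(open_tag)
        marked.append(tok)
        if i + 1 in ends:
            marked.append(close_tag)
    return marked


# Example: "long short-term memory network" treated as a single terminology group.
print(mark_terms(["the", "long", "short-term", "memory", "network", "helps"],
                 term_spans=[(1, 5)]))
# ['the', '<term>', 'long', 'short-term', 'memory', 'network', '</term>', 'helps']
```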