2022
DOI: 10.1155/2022/5199248
English-Chinese Machine Translation Model Based on Bidirectional Neural Network with Attention Mechanism

Abstract: In recent years, with the development of deep learning, machine translation using neural networks has gradually become the mainstream method in industry and academia. Existing Chinese-English machine translation models generally adopt deep neural network architectures based on the attention mechanism. However, modeling short and long sequences simultaneously remains a challenging problem. Therefore, a bidirectional LSTM model integrating an attention mechanism is proposed. Firstly, by using the word vector …
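The full paper is not shown here, but the core idea named in the abstract — a decoder attending over the hidden states of a bidirectional encoder — can be sketched generically. The following is a minimal plain-Python illustration of dot-product attention; the function and variable names are illustrative assumptions, not taken from the paper, and the encoder states would in practice come from a bidirectional LSTM rather than be supplied by hand:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_context(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state (e.g. the
    concatenated forward/backward states of a bidirectional LSTM)
    against the current decoder state, normalize the scores with
    softmax, and return the weights plus the weighted-sum context
    vector fed to the decoder."""
    scores = [sum(d * h for d, h in zip(decoder_state, state))
              for state in encoder_states]
    weights = softmax(scores)
    dim = len(decoder_state)
    context = [sum(w * state[i] for w, state in zip(weights, encoder_states))
               for i in range(dim)]
    return weights, context

# Toy example: three 2-dimensional encoder states, one decoder state.
enc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
dec = [1.0, 0.0]
weights, ctx = attention_context(dec, enc)
```

States most similar to the decoder query receive the largest weights, which is how attention lets the model focus on the relevant source positions regardless of sequence length.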

Cited by 12 publications (3 citation statements). References 29 publications.
“…As such, translators proficient in both languages play a vital role in bridging linguistic and cultural gaps in our increasingly diverse and interconnected world. The design and application of an English-Chinese bilingual teaching model based on multimodal learning represents a significant advancement in educational methodology [9]. Multimodal learning integrates various sensory modalities, such as visual, auditory, and kinesthetic, to enhance the learning experience.…”
Section: Introduction
confidence: 99%
“…As the RNN model has reached its performance bottleneck, the new self-attention mechanism [2] has become the favorite choice for translation system design. Machine translation models based on multilayer self-attention (dubbed Transformer) have demonstrated improved translation quality on various large-scale challenges in recent years [7][8][9][10]. For instance, the BERT translation engine [11] has a variable number of encoder layers and self-attention heads.…”
Section: Introduction
confidence: 99%
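The excerpt above contrasts recurrent models with self-attention. The mechanism it refers to can be sketched as scaled dot-product self-attention; this is a generic single-head illustration in plain Python with identity projections for brevity (a real Transformer layer additionally learns query/key/value projection matrices), so the names and simplifications here are assumptions for illustration only:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def self_attention(X):
    """Scaled dot-product self-attention (single head): every
    position attends to every other position, so a long-range
    dependency costs one step rather than many recurrent steps —
    the property that lets Transformers sidestep the RNN
    bottleneck mentioned above."""
    d = len(X[0])
    out = []
    for q in X:
        # Similarity of this position's query to every key,
        # scaled by sqrt(d) to keep the softmax well-conditioned.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in X]
        w = softmax(scores)
        # Output is the attention-weighted average of the values.
        out.append([sum(w[j] * X[j][i] for j in range(len(X)))
                    for i in range(d)])
    return out

# Toy example: two 2-dimensional input positions.
X = [[1.0, 0.0], [0.0, 1.0]]
out = self_attention(X)
```

Each output row is a convex combination of the inputs, weighted toward the positions most similar to that row's query.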
“…The attention mechanism (AM) was used in machine translation and achieved good results. Subsequently, it was found that introducing attention mechanisms into various neural network fields has a good training effect (Yonglan & Wenjia, 2022; Liu & Guo, 2019). Artificial Intelligence (AI) is based on deep learning, which includes many neural networks, so the attention mechanism also trains well in AI applications and has become an important component of neural networks. It has been widely used in the fields of face recognition, natural language processing, and audio processing.…”
confidence: 99%