2022
DOI: 10.1155/2022/3909726
LSTM-Based Attentional Embedding for English Machine Translation

Abstract: To reduce the workload of manual grading and improve grading efficiency, a computerized intelligent grading system for English translation based on natural language processing is designed, and an attention-embedded LSTM English machine translation model is proposed. First, because the standard LSTM network model uses fixed-dimensional vectors to represent words in the encoding stage, an English machine translation model based on LSTM attention embedding is establi…
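As a rough illustration of the idea in the abstract, the sketch below shows an LSTM encoder whose hidden states are combined by additive attention into a context vector, rather than compressing the source sentence into a single fixed-dimensional vector. This is a minimal PyTorch sketch under assumed module names, dimensions, and toy inputs; it is not the authors' implementation.

```python
# Minimal sketch (PyTorch): LSTM encoder with additive attention over its
# hidden states. All names, sizes, and the toy usage below are assumptions.
import torch
import torch.nn as nn

class AttentionEncoder(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # Additive (Bahdanau-style) attention parameters.
        self.attn_w = nn.Linear(hidden_dim, hidden_dim)
        self.attn_v = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, src_tokens):
        # src_tokens: (batch, src_len) integer token ids
        states, _ = self.lstm(self.embed(src_tokens))            # (batch, src_len, hidden)
        scores = self.attn_v(torch.tanh(self.attn_w(states)))    # (batch, src_len, 1)
        weights = torch.softmax(scores, dim=1)                    # normalize over source positions
        # Context vector: attention-weighted sum of encoder states, instead of
        # the single fixed-dimensional vector of a plain LSTM encoder.
        context = (weights * states).sum(dim=1)                   # (batch, hidden)
        return states, context, weights

if __name__ == "__main__":
    enc = AttentionEncoder(vocab_size=10_000)
    dummy = torch.randint(0, 10_000, (2, 12))   # two toy source sentences of 12 tokens
    states, context, weights = enc(dummy)
    print(states.shape, context.shape, weights.shape)
```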

Cited by 20 publications (4 citation statements); References 27 publications.
“…The statistical method for the amount of thinking is as follows: the proportion of the average number of Chinese/English words in the average total number of words (the total of Chinese and English words); the distribution of Chinese across thinking activities (the average number of Chinese words in a thinking activity divided by the average total number of Chinese words); and the proportion of the average number of Chinese/English words in the total amount of thinking in each thinking activity (mother-tongue versus autonomous English thinking). The statistics reported are averages over the 15 participants [12–14].…”
Section: Empirical Research (mentioning)
Confidence: 99%
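For concreteness, the proportion statistics described in this excerpt could be computed as in the hypothetical sketch below. The data layout, activity names, and word counts are invented placeholders, not the study's data.

```python
# Hypothetical sketch of the word-count proportions described above,
# averaged over participants. Field names and numbers are illustrative only.
from statistics import mean

# Each participant: Chinese/English word counts per thinking activity.
participants = [
    {"planning": {"zh": 40, "en": 10}, "wording": {"zh": 20, "en": 30}},
    {"planning": {"zh": 35, "en": 15}, "wording": {"zh": 25, "en": 25}},
]
activities = ["planning", "wording"]

# Average total Chinese/English word counts across participants.
avg_zh_total = mean(sum(p[a]["zh"] for a in activities) for p in participants)
avg_en_total = mean(sum(p[a]["en"] for a in activities) for p in participants)
avg_all = avg_zh_total + avg_en_total

# Proportion of the average Chinese/English word count in the average total.
print(f"Chinese share of total words: {avg_zh_total / avg_all:.2%}")
print(f"English share of total words: {avg_en_total / avg_all:.2%}")

# Distribution of Chinese across activities: average Chinese words in an
# activity divided by the average total number of Chinese words.
for a in activities:
    avg_zh_act = mean(p[a]["zh"] for p in participants)
    print(f"{a}: {avg_zh_act / avg_zh_total:.2%} of Chinese-language thinking")
```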
“…The Process of Optimizing BiLSTM Based on APPSO. Jian et al. [27] put forward the long short-term memory network, a variant of the recurrent neural network, which introduces a gating mechanism that simply and effectively addresses the gradient explosion and vanishing problems of the traditional recurrent neural network. The LSTM controls the transmission of information between cells through this gating mechanism.…”
Section: Data Preprocessing (mentioning)
Confidence: 99%
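A minimal sketch of the gating mechanism this excerpt refers to, written as a single explicit LSTM cell step: the sigmoid forget, input, and output gates control what flows between cells, which is what mitigates the vanishing and exploding gradients of a plain RNN. The weight shapes and toy inputs are assumptions for illustration, not taken from the cited work.

```python
# Illustrative LSTM cell step showing the input, forget, and output gates.
import torch

def lstm_cell_step(x, h_prev, c_prev, W, U, b):
    # x: (batch, input_dim); h_prev, c_prev: (batch, hidden_dim)
    # W: (input_dim, 4*hidden_dim); U: (hidden_dim, 4*hidden_dim); b: (4*hidden_dim,)
    gates = x @ W + h_prev @ U + b
    i, f, g, o = gates.chunk(4, dim=-1)
    i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)  # gates in (0, 1)
    g = torch.tanh(g)                                               # candidate cell state
    c = f * c_prev + i * g      # forget gate scales old memory, input gate admits new content
    h = o * torch.tanh(c)       # output gate decides how much of the cell state is exposed
    return h, c

if __name__ == "__main__":
    batch, input_dim, hidden_dim = 2, 8, 16
    x = torch.randn(batch, input_dim)
    h = c = torch.zeros(batch, hidden_dim)
    W = torch.randn(input_dim, 4 * hidden_dim)
    U = torch.randn(hidden_dim, 4 * hidden_dim)
    b = torch.zeros(4 * hidden_dim)
    h, c = lstm_cell_step(x, h, c, W, U, b)
    print(h.shape, c.shape)
```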
“…There are quite a lot of ways to use machine learning models. The main field of application and experimentation is image processing (Jian et al., 2022). In the economic sphere, it is the analysis of financial markets and the quotations of various assets (Kaminsky et al., 2020; Kerr et al., 2021).…”
Section: Introduction (mentioning)
Confidence: 99%