2021
DOI: 10.1155/2021/7058723

English Grammar Error Detection Using Recurrent Neural Networks

Abstract: Automatic marking of English compositions has developed rapidly in recent years, gradually replacing teachers' manual grading and becoming an important tool for easing the teaching burden. The existing literature shows that verb agreement errors and verb tense errors are the two most frequent types of grammatical error in English compositions. Hence, the detection results for verb errors can reflect the practicality and effectiveness of an automatic marking system. This…


Cited by 22 publications (8 citation statements). References 23 publications.
“…Truscott adopts a simple-to-complex, step-by-step error-correction method, using a language model to correct simple errors and a word-level Transformer model to correct complex errors [ 16 ]. He proposed a Chinese grammar error correction model based on an enhanced Transformer architecture, which uses a dynamic residual structure that combines the outputs of different neural modules to strengthen the model's ability to capture semantic information [ 17 ]. Podymov used a multilayer convolutional sequence-generation model and reranked the final generated results with a language model; it became the first neural error-correction model to surpass statistical machine translation methods [ 18 ].…”
Section: Related Work
confidence: 99%
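The reranking step attributed to Podymov [18] can be illustrated with a minimal sketch: a correction model proposes several candidate sentences, and a language model scores each one, keeping the candidate it prefers. The bigram "language model" below is a toy stand-in for the neural language model the citation describes; the corpus, names, and smoothing are illustrative assumptions, not the cited system.

```python
# Toy sketch of language-model reranking of candidate corrections.
# The bigram model here is a stand-in for a real neural LM.
from collections import Counter

CORPUS = "he goes to school . she goes home . they go to school .".split()

bigrams = Counter(zip(CORPUS, CORPUS[1:]))
unigrams = Counter(CORPUS)

def score(sentence):
    """Pseudo-probability: product of add-one-smoothed bigram ratios."""
    toks = sentence.split()
    s = 1.0
    for a, b in zip(toks, toks[1:]):
        s *= (bigrams[(a, b)] + 1) / (unigrams[a] + len(unigrams))
    return s

def rerank(candidates):
    """Keep the candidate correction the language model prefers."""
    return max(candidates, key=score)

print(rerank(["he go to school .", "he goes to school ."]))
# → he goes to school .
```

In the cited system the candidates come from a convolutional sequence-generation model rather than being supplied by hand, but the reranking principle is the same.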
“…When the number of training samples is relatively small, the machine learning algorithm can still achieve good regression generalization. In the linearly inseparable case, the algorithm maps the data into a higher-dimensional space through a kernel function and constructs a linear decision function in that space, thereby overcoming the dimensionality problem [23]. The kernel function determines the complexity of the regression function set, and the algorithm's performance is controlled by a learning strategy that embodies the principle of structural risk minimization.…”
Section: Machine Learning Algorithm Principles
confidence: 99%
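The kernel idea quoted above can be sketched concretely: a kernel function implicitly maps the data into a higher-dimensional space, where a linear regressor is fitted, even though the data are linearly inseparable in the original space. The sketch below uses kernel ridge regression with an RBF kernel on XOR-style data; it is an illustration of the general technique, not the cited paper's exact algorithm, and all parameter values are assumptions.

```python
# Minimal kernel-trick sketch: fit a linear function in the (implicit)
# feature space induced by an RBF kernel. Kernel ridge regression.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """K[i, j] = exp(-gamma * ||A[i] - B[j]||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit(X, y, lam=1e-3, gamma=1.0):
    """Solve (K + lam*I) alpha = y; the model is linear in kernel space."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# XOR-like data: linearly inseparable in the original input space.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])
alpha = fit(X, y)
print(np.round(predict(X, alpha, X), 2))  # approximately [0, 1, 1, 0]
```

No linear function of the raw two-dimensional inputs can fit this XOR target, which is exactly the "linear inseparability" the quoted passage resolves via the kernel function.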
“…Vocabulary that has not yet been acquired is incorporated into a new thesaurus, and learners are given repeated recommendations until they have learned the English vocabulary [20].…”
Section: Vocabulary Learning Module
confidence: 99%
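The recommendation loop described in [20] amounts to a simple re-queueing scheme: words the learner has not yet acquired are put back into the pool and recommended again until learned. A hedged sketch under that reading, with a stubbed-in "learned" test (the function names and the exposure-count criterion are illustrative assumptions):

```python
# Sketch of the repeat-until-learned vocabulary recommendation loop.
# Unacquired words are re-enqueued and recommended again.
from collections import deque

def drill(words, knows, max_rounds=10):
    """Recommend words repeatedly; return the words still unlearned."""
    pool = deque(words)
    rounds = 0
    while pool and rounds < max_rounds:
        word = pool.popleft()
        if not knows(word):      # still unacquired: back into the pool
            pool.append(word)
        rounds += 1
    return list(pool)

def make_learner(needed=2):
    """Stub learner: a word counts as acquired after `needed` exposures."""
    seen = {}
    def knows(word):
        seen[word] = seen.get(word, 0) + 1
        return seen[word] >= needed
    return knows

print(drill(["apple", "tense"], make_learner(needed=2)))  # → []
```

The `max_rounds` cap is a practical guard the quoted passage does not mention; without it, a word the learner never acquires would loop forever.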