2024
DOI: 10.5715/jnlp.31.3

Bidirectional Transformer Reranker for Grammatical Error Correction

Ying Zhang,
Hidetaka Kamigaito,
Manabu Okumura

Abstract: Pre-trained sequence-to-sequence (seq2seq) models have achieved state-of-the-art results in grammatical error correction tasks. However, these models suffer from prediction bias owing to their unidirectional decoding. Thus, this study proposed a bidirectional transformer reranker (BTR) that re-estimates the probability of each candidate sentence generated by the pre-trained seq2seq model. The BTR preserves the seq2seq-style transformer architecture but utilizes a BERT-style self-attention mechanism in t…
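As a rough illustration of the reranking idea described in the abstract (not the authors' implementation), the sketch below scores each candidate correction produced by a seq2seq model with a masked-LM-style pseudo-log-likelihood and keeps the highest-scoring one. The `toy_scorer` and example beams are hypothetical placeholders standing in for a real bidirectional scorer.

```python
from typing import Callable, List

# A TokenScorer(tokens, i) is assumed to return log p(tokens[i] | tokens with
# position i masked), i.e. a masked-LM-style bidirectional score.
TokenScorer = Callable[[List[str], int], float]

def pseudo_log_likelihood(tokens: List[str], score: TokenScorer) -> float:
    """Sum per-token masked-LM log-probabilities over all positions."""
    return sum(score(tokens, i) for i in range(len(tokens)))

def rerank(candidates: List[List[str]], score: TokenScorer) -> List[str]:
    """Pick the candidate correction with the highest pseudo-log-likelihood."""
    return max(candidates, key=lambda cand: pseudo_log_likelihood(cand, score))

if __name__ == "__main__":
    # Toy scorer for demonstration only: penalizes "are" after the singular
    # subject "sentence"; a real scorer would query a bidirectional model.
    def toy_scorer(tokens: List[str], i: int) -> float:
        if tokens[i] == "are" and i > 0 and tokens[i - 1] == "sentence":
            return -3.0
        return -0.5

    beams = [
        ["this", "sentence", "is", "correct", "."],
        ["this", "sentence", "are", "correct", "."],
    ]
    print(" ".join(rerank(beams, toy_scorer)))  # -> "this sentence is correct ."
```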


Cited by 2 publications
References: 40 publications (90 reference statements)