Findings of the Association for Computational Linguistics: EMNLP 2021 2021
DOI: 10.18653/v1/2021.findings-emnlp.396
Secoco: Self-Correcting Encoding for Neural Machine Translation

Abstract: This paper presents Self-correcting Encoding (Secoco), a framework that effectively deals with input noise for robust neural machine translation by introducing self-correcting predictors. Different from previous robust approaches, Secoco enables NMT to explicitly correct noisy inputs and delete specific errors simultaneously with the translation decoding process. Secoco is able to achieve significant improvements of 1.6 BLEU points over strong baselines on two real-world test sets and a benchmark WMT dataset w…
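The abstract describes deletion predictors that correct noisy inputs alongside decoding. A minimal sketch of the idea, assuming per-token deletion probabilities and an insertion map produced by such predictors (the function name, threshold, and data structures here are illustrative, not the paper's actual implementation):

```python
# Hypothetical sketch: applying per-token edit predictions to correct a
# noisy input sequence. The 0.5 threshold and the insert_after mapping
# are illustrative assumptions, not details from the paper.

def self_correct(tokens, delete_probs, insert_after):
    """Apply deletion and insertion predictions to a token sequence.

    tokens       : list of input tokens (possibly noisy)
    delete_probs : per-token probability that the token is noise
    insert_after : dict mapping position -> token predicted to be missing
    """
    corrected = []
    for i, (tok, p_del) in enumerate(zip(tokens, delete_probs)):
        if p_del < 0.5:          # keep tokens the deletion head trusts
            corrected.append(tok)
        if i in insert_after:    # add a token the insertion head predicts
            corrected.append(insert_after[i])
    return corrected

# Noisy input: a spurious token ("teh") and a word dropped after "is".
noisy = ["this", "teh", "is", "a", "test"]
p_del = [0.1, 0.9, 0.2, 0.1, 0.1]
ins = {2: "just"}
print(self_correct(noisy, p_del, ins))
# -> ['this', 'is', 'just', 'a', 'test']
```

In the actual framework these edit predictions are made by the encoder jointly with translation rather than as a separate preprocessing pass; the sketch only shows how deletion and insertion decisions reshape a noisy sequence.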

Cited by 2 publications (1 citation statement)
References 22 publications
“…Other methods include the construction of source- and target-side adversarial examples [11] for noisy NMT. Subsequently, insertion and deletion predictors were added to the NMT model, which improved results on noisy inputs [73]. It was observed that multi-tasking with translation and denoising objectives helps to generate robust translations [56].…”
Section: Noisy Neural Machine Translation
confidence: 99%
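The citation statement mentions multi-tasking with translation and denoising objectives. A common way to realize this is a weighted sum of the per-task losses; the sketch below is a generic formulation with illustrative weight names, not the specific objective used by any of the cited papers:

```python
# Hypothetical sketch of a multi-task training objective that combines a
# translation loss with edit-prediction (denoising) losses. The weights
# alpha and beta are illustrative hyperparameters.

def multitask_loss(translation_loss, deletion_loss, insertion_loss,
                   alpha=1.0, beta=1.0):
    """Weighted sum of the translation and denoising objectives.

    Gradients from all three terms flow into the shared encoder, so the
    model learns to translate and to spot noise at the same time.
    """
    return translation_loss + alpha * deletion_loss + beta * insertion_loss

# Example: equal weighting of all three objectives.
print(multitask_loss(2.0, 0.5, 0.5))
# -> 3.0
```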