Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1086

Dynamic Past and Future for Neural Machine Translation

Abstract: Previous studies have shown that neural machine translation (NMT) models can benefit from explicitly modeling translated (PAST) and untranslated (FUTURE) source contents as recurrent states (Zheng et al., 2018). However, this less interpretable recurrent process hinders its power to model the dynamic updating of PAST and FUTURE contents during decoding. In this paper, we propose to model the dynamic principles by explicitly separating source words into groups of translated and untranslated contents through par…
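The abstract describes the core mechanism: at each decoding step, source words are softly assigned to a translated (PAST) group and an untranslated (FUTURE) group, and these groups are updated dynamically as decoding proceeds. Below is a minimal illustrative sketch of that idea in Python/NumPy. It uses a simple bilinear, attention-style assignment rather than the paper's actual capsule routing; the function route_past_future and the matrices W_past/W_future are hypothetical names introduced only for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def route_past_future(src_states, dec_state, W_past, W_future):
    """Softly assign each source word to the translated (PAST) or
    untranslated (FUTURE) group, conditioned on the current decoder state,
    then aggregate each group into a single vector.

    src_states: (n_src, d) encoder outputs
    dec_state:  (d,)       decoder hidden state at the current step
    W_past, W_future: (d, d) hypothetical projection matrices
    """
    # Score how strongly each source word belongs to each group,
    # using a simple bilinear score against the decoder state.
    score_past = src_states @ W_past @ dec_state      # (n_src,)
    score_future = src_states @ W_future @ dec_state  # (n_src,)
    # Per-word soft assignment over the two groups.
    assign = softmax(np.stack([score_past, score_future], axis=-1), axis=-1)
    # Group vectors: assignment-weighted sums of the source states.
    past_vec = assign[:, 0] @ src_states    # (d,)
    future_vec = assign[:, 1] @ src_states  # (d,)
    return past_vec, future_vec, assign

# Toy usage: 5 source words, hidden size 8.
rng = np.random.default_rng(0)
d, n_src = 8, 5
src = rng.normal(size=(n_src, d))
dec = rng.normal(size=(d,))
Wp, Wf = rng.normal(size=(d, d)), rng.normal(size=(d, d))
past, future, assign = route_past_future(src, dec, Wp, Wf)
print(assign.round(2))  # per-word PAST/FUTURE membership at this decoding step
```

In the paper this assignment is refined iteratively with routing and fed back into the decoder; the sketch only shows the grouping step at a single decoding position.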

Cited by 31 publications (17 citation statements) | References 24 publications

“…Translation Quality The results on the EN→DE, DE→EN and ZH→EN are shown in Table 1. For a fair comparison, we also report several Transformer baselines from previous work (Vaswani et al., 2017; Zheng et al., 2019; Dou et al., 2018). Our Transformer baseline achieves similar or better results compared with them.…”
Section: Results (mentioning)
confidence: 99%
“…Several Transformer systems with the same settings (Vaswani et al., 2017; Hassan et al., 2018; Gu et al., 2017) are reported as a comparison (lines 1-6). Then, several related studies on improving the faithfulness of NMT (Kong et al., 2019; Zheng et al., 2019; Feng et al., 2020) or on exploiting translations to improve NMT (Xia et al., 2017;) are also reported (lines 7-12). We implement three comparable approaches on our Transformer baseline, including: 1).…”
Section: Automatic Evaluation (mentioning)
confidence: 96%
“…proposed to model a global representation on the source side to improve the source representation. Zheng et al. (2019) proposed a capsule-based module to control the source representation dynamically during decoding. Feng et al. (2020) and Garg et al. (2019) proposed to introduce word alignment information into the Transformer to improve translation accuracy.…”
Section: Related Work (mentioning)
confidence: 99%
“…Therefore, we construct a high-quality annotated corpus (TransErr) comprising 15,000 Chinese-English translation pairs with inter-annotator agreement of 0.804 measured by Cohen's Kappa (Cohen, 1960). Different from existing error detection works, which cover all error classes, we currently only consider missing and wrong translations, the major errors related to adequacy, which is a well-known issue in neural machine translation (NMT) (Zheng et al., 2019). The error tags are annotated on the source (Chinese) sentences to reflect faithfulness and adequacy with respect to the source.…”
Section: Introduction (mentioning)
confidence: 99%
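The citing passage above reports inter-annotator agreement of 0.804 measured with Cohen's Kappa. For reference, Kappa is observed agreement corrected for chance agreement, kappa = (p_o - p_e) / (1 - p_e). A minimal sketch of the computation, using hypothetical toy error tags (OK / MISSING / WRONG) purely for illustration:

```python
from collections import Counter

def cohen_kappa(labels_a, labels_b):
    """Cohen's Kappa (Cohen, 1960): chance-corrected agreement
    between two annotators labelling the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items on which the annotators agree.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement if the two annotators labelled independently.
    count_a, count_b = Counter(labels_a), Counter(labels_b)
    labels = set(labels_a) | set(labels_b)
    p_e = sum((count_a[l] / n) * (count_b[l] / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

# Toy example with hypothetical adequacy tags.
a = ["OK", "OK", "MISSING", "WRONG", "OK", "MISSING"]
b = ["OK", "OK", "MISSING", "OK", "OK", "WRONG"]
print(round(cohen_kappa(a, b), 3))
```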