Findings of the Association for Computational Linguistics: EMNLP 2023
DOI: 10.18653/v1/2023.findings-emnlp.773

Addressing the Length Bias Challenge in Document-Level Neural Machine Translation

Zhuocheng Zhang,
Shuhao Gu,
Min Zhang
et al.

Abstract: Document-level neural machine translation (DNMT) has shown promising results by incorporating more context information. However, this approach also introduces a length bias problem, whereby DNMT suffers from significant translation quality degradation when decoding documents that are much shorter or longer than the maximum sequence length used during training. To solve the length bias problem, we propose to improve the DNMT model in terms of its training method, attention mechanism, and decoding strategy. Firstly, we propose to…
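The length bias described above arises from how DNMT training data is typically prepared: consecutive sentences are packed into segments bounded by a fixed token budget, so the model rarely sees inputs far from that length at training time. The sketch below illustrates this setup only (it is not the authors' code); the function name split_into_segments, the max_len parameter, and the whitespace-level tokens are illustrative assumptions.

```python
# Minimal sketch of the standard DNMT data setup that gives rise to length bias:
# consecutive sentences are greedily packed into training segments of at most
# max_len tokens. Names and tokenization here are illustrative, not the paper's.

from typing import List


def split_into_segments(sentences: List[List[str]], max_len: int) -> List[List[str]]:
    """Greedily concatenate consecutive sentences into training segments,
    keeping each segment within max_len tokens where possible."""
    segments: List[List[str]] = []
    current: List[str] = []
    for sent in sentences:
        # Start a new segment if adding this sentence would exceed the budget.
        if current and len(current) + len(sent) > max_len:
            segments.append(current)
            current = []
        current = current + sent  # a single over-long sentence still forms its own segment
    if current:
        segments.append(current)
    return segments


if __name__ == "__main__":
    # Toy "document": eight tokenized sentences of five tokens each.
    doc = [["this", "is", "sentence", str(i), "."] for i in range(8)]
    train_segments = split_into_segments(doc, max_len=12)
    print([len(seg) for seg in train_segments])  # -> [10, 10, 10, 10]
```

Under this scheme the model is trained almost exclusively on segments close to max_len tokens, so at decoding time a document that is much shorter or much longer than that budget falls outside the length distribution seen in training, which is the mismatch the abstract refers to as length bias.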
