2021
DOI: 10.48550/arxiv.2109.05712
Preprint
Contrastive Learning for Context-aware Neural Machine Translation Using Coreference Information

Yongkeun Hwang,
Hyungu Yun,
Kyomin Jung

Abstract: Context-aware neural machine translation (NMT) incorporates contextual information from surrounding texts, which can improve the translation quality of document-level machine translation. Many existing works on context-aware NMT have focused on developing new model architectures for incorporating additional contexts and have shown some promising results. However, most existing works rely on cross-entropy loss, resulting in limited use of contextual information. In this paper, we propose CorefCL, a novel data augme…
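The abstract describes augmenting cross-entropy training with a contrastive objective that contrasts correct translations against corrupted ones. As a rough illustration of that idea (not the paper's exact formulation, whose details are truncated here), a max-margin contrastive loss over per-sentence model scores could look like the following sketch; the function name, score convention (log-probabilities), and margin value are all illustrative assumptions:

```python
def contrastive_margin_loss(pos_score, neg_score, margin=1.0):
    """Max-margin contrastive loss (illustrative sketch).

    Encourages the model to score the correct, context-consistent
    translation (pos_score, e.g. its log-probability) higher than a
    corrupted contrastive example (neg_score) by at least `margin`.
    Returns 0.0 once the margin is satisfied, so well-separated pairs
    contribute no gradient.
    """
    return max(0.0, margin - (pos_score - neg_score))


# Usage: scores here are hypothetical sentence-level log-probabilities.
loss_satisfied = contrastive_margin_loss(-1.0, -3.0)   # gap 2.0 >= margin -> 0.0
loss_violated = contrastive_margin_loss(-2.0, -2.5)    # gap 0.5 < margin  -> 0.5
```

In practice such a term would be added to the usual cross-entropy loss with a weighting coefficient; the actual CorefCL objective and its corruption scheme (coreference-based, per the title) may differ from this simplified margin form.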

Cited by 1 publication (1 citation statement)
References 25 publications
“…In addition to the straightforward training on newly constructed sets, we also train models with contrastive learning, which is inspired by previous works (Hwang et al., 2021; Hu and Li, 2022) that recognize the effectiveness of contrastive learning in improving the robustness of NLP and NMT models. By employing this method, we can analyze the performance of C-MTNT on a wider range of models trained with different approaches and settings.…”
Section: Contrastive Learning
confidence: 99%