Findings of the Association for Computational Linguistics: EMNLP 2021
DOI: 10.18653/v1/2021.findings-emnlp.79

Exploiting Curriculum Learning in Unsupervised Neural Machine Translation

Abstract: Back-translation (BT) has become one of the de facto components of unsupervised neural machine translation (UNMT), as it explicitly endows UNMT with translation ability. However, all the pseudo bi-texts generated by BT are treated equally as clean data during optimization, without considering their diversity in quality, leading to slow convergence and limited translation performance. To address this problem, we propose a curriculum learning method to gradually utilize pseudo bi-texts based on their quality from mult…
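The abstract's core idea, admitting pseudo bi-texts gradually by quality, can be sketched as an easy-to-hard curriculum filter. The function below is a minimal illustrative sketch, not the paper's actual scheduling scheme; the quality scores and the linear threshold decay are assumptions for illustration:

```python
def curriculum_subset(pseudo_bitexts, progress):
    """Select pseudo bi-texts for the current training stage.

    pseudo_bitexts: list of (src, tgt, quality) tuples, quality in [0, 1]
                    (e.g. a round-trip or LM-based quality estimate).
    progress: training progress in [0, 1]; as it grows, lower-quality
              pairs are gradually admitted (easy-to-hard curriculum).
    """
    # Threshold starts at 1.0 (only the cleanest pairs) and decays to 0,
    # so noisy BT outputs enter training only in later stages.
    threshold = 1.0 - progress
    return [(s, t) for (s, t, q) in pseudo_bitexts if q >= threshold]


# Illustrative usage with hypothetical quality scores:
data = [("a", "A", 0.9), ("b", "B", 0.5), ("c", "C", 0.2)]
early = curriculum_subset(data, 0.2)   # only high-quality pairs
late = curriculum_subset(data, 1.0)    # full pseudo-parallel corpus
```

Early in training only the cleanest pseudo pairs are used; by the end, the whole BT-generated corpus participates, which is the intuition behind faster convergence under noisy pseudo bi-texts.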

Cited by 2 publications (1 citation statement). References 26 publications (9 reference statements).
“…Curriculum Learning. The concept of Curriculum Learning (CL) was first introduced by Bengio et al. (2009) and has since demonstrated its advantages across a diverse array of machine learning tasks (Platanios et al., 2019; Wang et al., 2019; Xu et al., 2020; Lu and Zhang, 2021). One advantage of CL is denoising, achieved by spending more training time on clean data and less on noisy data.…”
Section: Related Work
confidence: 99%