2018
DOI: 10.4000/ijcol.531

Multilingual Neural Machine Translation for Low-Resource Languages

Abstract: In recent years, Neural Machine Translation (NMT) has been shown to be more effective than phrase-based statistical methods, thus quickly becoming the state of the art in machine translation (MT). However, NMT systems are limited in translating low-resourced languages, due to the significant amount of parallel data that is required to learn useful mappings between languages. In this work, we show how the so-called multilingual NMT can help to tackle the challenges associated with low-resourced language transla…


Cited by 34 publications (20 citation statements)
References 43 publications
“…Moreover, the attention mechanism of the many-to-one strategy is fine-tuned on the generated pseudo-parallel corpus. Following Google's multilingual NMT, Lakew et al. (2017) proposed a self-learning algorithm for zero-shot NMT, which iterates through a train-infer-train process to generate synthetic parallel data for zero-shot translation. In this way, the quality of the generated parallel data is significantly improved.…”
Section: Zero-shot Translation
confidence: 99%
“…Notably, in this way the multilingual NMT system can perform zero-shot translation through the shared model. To improve performance on zero-shot language pairs, Lakew et al. (2017) proposed a self-learning algorithm that generates synthetic parallel data by repeatedly translating existing target data through the multilingual NMT system. The whole process is a self-learning cycle of train-infer-train.…”
Section: Introduction
confidence: 99%
“…Inspired by multilingual NMT, Gu et al. improved translation quality for tiny or even zero-resource parallel corpora by sharing a universal word-level and sentence-level representation [34]. For zero-resource parallel corpora, Lakew et al. generated new synthetic data through a training-inference-training scheme based on a multilingual NMT system [35]. Xia et al. proposed a dual-learning algorithm, dual-NMT, to tackle the training-data bottleneck, in which two models teach each other via feedback signals [36].…”
Section: Related Work
confidence: 99%
“…In particular, multilingual NMT models trained on English-centric parallel corpora have shown significant improvements for translation between English and low-resource languages (Firat, Cho, and Bengio 2016; Johnson et al. 2017). Translation between non-English languages has received less attention, with the default approach being pivot translation (Lakew et al. 2017). Pivot translation is a strong baseline, but it needs multiple decoding steps, resulting in increased latency and cascading errors.…”
Section: Introduction
confidence: 99%
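The pivot-translation baseline mentioned here chains two decoding steps through English, which is where the extra latency and cascading errors come from. A minimal sketch, assuming a placeholder `translate` function standing in for a trained NMT decoder:

```python
# Hypothetical sketch of pivot translation between two non-English
# languages, routing through English ("en") as the pivot.
# `translate` is an illustrative stand-in, not a real NMT API.

def translate(sentence, src, tgt):
    # Stand-in for one decoding step of a trained NMT system;
    # here it just tags the text with the direction it simulated.
    return f"[{src}->{tgt}] {sentence}"

def pivot_translate(sentence, src, tgt, pivot="en"):
    """Two decoding steps: src -> pivot, then pivot -> tgt.
    Errors made in the first step propagate into the second."""
    intermediate = translate(sentence, src=src, tgt=pivot)
    return translate(intermediate, src=pivot, tgt=tgt)
```

Because the second step decodes from the first step's output rather than the original source, any first-step error is baked into the input of the second step, which is the cascading-error problem the text refers to.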
“…Hence, vanilla zero-shot translation quality significantly lags behind pivot translation. Various methods have been proposed to address these limitations by aligning encoder representations (Arivazhagan et al. 2019) or by using a pseudo-parallel corpus between non-English languages during training (Lakew et al. 2017).…”
Section: Introduction
confidence: 99%