2016
DOI: 10.48550/arxiv.1612.02482
Preprint

Improving the Performance of Neural Machine Translation Involving Morphologically Rich Languages

Abstract: The advent of the attention mechanism in neural machine translation models has improved the performance of machine translation systems by enabling selective lookup into the source sentence. In this paper, the efficiency of bidirectional encoder-attention-decoder models was studied for translation involving morphologically rich languages. The English-Tamil language pair was selected for this analysis. First, the use of Word2Vec embeddings for both the English and Tamil words impr…
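The "selective lookup" the abstract refers to can be illustrated with a minimal dot-product attention sketch. This is a generic illustration of the mechanism, not the paper's exact model; the toy keys, values, and query below are invented for the example.

```python
import numpy as np

def attention(query, keys, values):
    """Selective lookup: weight source states (values) by query-key similarity."""
    scores = keys @ query                 # one similarity score per source position
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()              # softmax -> attention distribution
    return weights @ values, weights      # weighted context vector, plus weights

# Toy example: 3 source positions with 2-dimensional encoder states.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = keys.copy()
query = np.array([10.0, -5.0])           # most similar to source position 0
context, w = attention(query, keys, values)
```

At each decoding step the decoder's query redistributes the attention weights, so the context vector tracks the most relevant source positions rather than a single fixed sentence summary.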

Cited by 4 publications (1 citation statement)
References 14 publications
“…8.1). Compound or morpheme splitting [238,239] can mitigate this issue only to a certain extent. More importantly, a fully-trained NMT system even with a very large vocabulary cannot be extended with new words.…”
Section: Character-based NMT
Confidence: 99%