2024
DOI: 10.3390/math12070997

Neural Machine Translation with CARU-Embedding Layer and CARU-Gated Attention Layer

Sio-Kei Im, Ka-Hou Chan

Abstract: The attention mechanism performs well on the Neural Machine Translation (NMT) task but depends heavily on the context vectors generated by the attention network to predict target words. This reliance raises the issue of long-term dependencies. Indeed, predicates are very commonly combined with postpositions in sentences, and the same predicate may take on different meanings when combined with different postpositions. This poses an additional challenge for NMT research. In this work, we observe that th…
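The abstract refers to context vectors generated by an attention network for predicting target words. Since the paper's CARU-embedding and CARU-gated attention layers are not described in the truncated abstract, the sketch below only illustrates the baseline mechanism it builds on: standard additive (Bahdanau-style) attention producing a context vector per decoding step. All class and parameter names here are illustrative, not taken from the paper.

```python
# Minimal sketch of additive (Bahdanau-style) attention producing the
# context vectors the abstract refers to. This is NOT the paper's
# CARU-gated attention layer; shapes and names are assumptions.
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    def __init__(self, enc_dim: int, dec_dim: int, attn_dim: int):
        super().__init__()
        self.W_enc = nn.Linear(enc_dim, attn_dim, bias=False)  # project encoder states
        self.W_dec = nn.Linear(dec_dim, attn_dim, bias=False)  # project decoder state
        self.v = nn.Linear(attn_dim, 1, bias=False)            # score each source position

    def forward(self, enc_states, dec_state):
        # enc_states: (batch, src_len, enc_dim); dec_state: (batch, dec_dim)
        scores = self.v(torch.tanh(
            self.W_enc(enc_states) + self.W_dec(dec_state).unsqueeze(1)
        )).squeeze(-1)                                         # (batch, src_len)
        weights = torch.softmax(scores, dim=-1)                # attention distribution
        context = torch.bmm(weights.unsqueeze(1), enc_states)  # weighted sum of states
        return context.squeeze(1), weights                     # context vector per step

# Usage: one decoding step over a toy batch.
enc = torch.randn(2, 7, 16)   # 2 sentences, 7 source tokens, 16-dim encoder states
dec = torch.randn(2, 32)      # current decoder hidden state
attn = AdditiveAttention(enc_dim=16, dec_dim=32, attn_dim=24)
context, weights = attn(enc, dec)
print(context.shape, weights.shape)  # torch.Size([2, 16]) torch.Size([2, 7])
```

Because the predicted word depends entirely on this weighted sum, any weakness in the attention distribution over long sentences propagates directly into the prediction, which is the long-term dependency issue the abstract raises.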

Cited by 3 publications
References 54 publications