Proceedings of the 12th Workshop on Multiword Expressions 2016
DOI: 10.18653/v1/w16-1808
Using Word Embeddings for Improving Statistical Machine Translation of Phrasal Verbs

Abstract: We examine the employment of word embeddings for machine translation (MT) of phrasal verbs (PVs), a linguistic phenomenon with challenging semantics. Using word embeddings, we augment the translation model with two features: one modelling distributional semantic properties of the source and target phrase and another modelling the degree of compositionality of PVs. We also obtain paraphrases to increase the amount of relevant training data. Our method leads to improved translation quality for PVs in a case stud…
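The abstract's second feature, the degree of compositionality of a phrasal verb, is commonly estimated by comparing the PV's own embedding with a composition (e.g. the sum) of its component word embeddings. The sketch below illustrates that idea with toy vectors; the function names and 3-dimensional vectors are hypothetical, not taken from the paper.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def compositionality(pv_vec, verb_vec, particle_vec):
    """Cosine between the phrasal verb's own embedding and the sum of its
    parts: higher values suggest more compositional usage."""
    return cosine(pv_vec, verb_vec + particle_vec)

# Toy 3-d vectors (hypothetical, for illustration only)
give_up = np.array([0.1, 0.9, 0.2])   # embedding of the PV "give up"
give = np.array([0.8, 0.1, 0.1])      # embedding of "give"
up = np.array([0.1, 0.1, 0.8])        # embedding of "up"

score = compositionality(give_up, give, up)
print(score)  # low score: "give up" is used idiomatically here
```

In practice the PV embedding would be trained by treating the whole phrasal verb as a single token, and the resulting score can be fed into the translation model as a dense feature.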

Cited by 6 publications (4 citation statements). References 9 publications (12 reference statements).
“…Bilingual embeddings are vector representations of two languages mapped into a shared space, such that translated word pairs have similar vectors (Gouws et al., 2015). They facilitate applications from parallel sentence extraction (Grover and Mitra, 2017) to machine translation (Zou et al., 2013; Cholakov and Kordoni, 2016; Artetxe et al., 2017b) and can be used to improve monolingual embeddings (Faruqui and Dyer, 2014). Bilingual embeddings are learned via one of three methods: mapping both spaces into a shared space (Mikolov et al., 2013b), monolingual adaptation of one language's embedding space into another's (Zou et al., 2013), or bilingually training both embeddings simultaneously (AP et al., 2014).…”
Section: Related Work
confidence: 99%
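The first of the three methods mentioned above, mapping one embedding space into another (Mikolov et al., 2013b), amounts to learning a linear transformation W that minimises the Frobenius-norm error over a seed translation dictionary. A minimal sketch with synthetic data, assuming the two spaces share a dimensionality; all names and the toy dictionary are illustrative, not from the cited work:

```python
import numpy as np

# Hypothetical seed dictionary: rows of X are source-language vectors,
# rows of Y are the embeddings of their translations.
rng = np.random.default_rng(0)
W_true = rng.normal(size=(4, 4))        # hidden "true" mapping for the toy data
X = rng.normal(size=(50, 4))            # 50 source-side dictionary entries
Y = X @ W_true                          # perfectly linear toy targets

# Learn W minimising ||XW - Y||_F via ordinary least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Map an unseen source vector into the target space; on this synthetic
# data it should land essentially on its gold translation.
x_new = rng.normal(size=4)
err = np.linalg.norm(x_new @ W - x_new @ W_true)
print(err)
```

Real embedding spaces are only approximately linearly related, so the mapped vector is typically matched to its nearest target-space neighbour rather than an exact point.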
“…Word embeddings have become commonly used in many MWE-related tasks due to their intriguing properties. For instance, word embeddings are used to improve the quality of the translation of phrasal verbs in [7].…”
Section: Previous Work
confidence: 99%
“…It is due to this abundance of FSs in everyday written and spoken discourse that, apart from traditional classroom interest in them, FSs are the main focus of interest in different fields, such as computer-assisted language learning (CALL) (Stengers et al., 2014), mobile learning (M-learning) (Hayati et al., 2013), machine translation (Cholakov et al., 2014), eye tracking (Siyanova-Chanturia, 2013; Siyanova-Chanturia et al., 2011; Conklin, 2014, 2015), neurolinguistics (Boulenger et al., 2012; Zhang et al., 2013), and brain diseases/damage (Reuterskiöld and Van Lancker-Sidtis, 2013; Van Lancker-Sidtis, 2006).…”
Section: Literature Review
confidence: 99%