Proceedings of SSST-8, Eighth Workshop on Syntax, Semantics and Structure in Statistical Translation 2014
DOI: 10.3115/v1/w14-4001
Vector Space Models for Phrase-based Machine Translation

Abstract: This paper investigates the application of vector space models (VSMs) to the standard phrase-based machine translation pipeline. VSMs are models based on continuous word representations embedded in a vector space. We exploit word vectors to augment the phrase table with new inferred phrase pairs. This helps reduce out-of-vocabulary (OOV) words. In addition, we present a simple way to learn bilingually-constrained phrase vectors. The phrase vectors are then used to provide additional scoring of phrase pairs, wh…
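As a rough, assumed illustration of the OOV-reduction idea described in the abstract (not the authors' actual implementation), the Python sketch below infers new phrase pairs for an out-of-vocabulary source word by copying the phrase-table entries of its nearest in-vocabulary neighbour in the source embedding space. The names src_vecs, phrase_table and infer_pairs_for_oov are hypothetical.

import numpy as np

# Hypothetical sketch: reduce OOVs by borrowing phrase-table entries from the
# nearest in-vocabulary word in embedding space.
def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def infer_pairs_for_oov(oov, src_vecs, phrase_table):
    # src_vecs: dict word -> np.ndarray
    # phrase_table: dict source word -> [(target phrase, score), ...]
    if oov not in src_vecs:
        return []
    candidates = [w for w in phrase_table if w in src_vecs and w != oov]
    if not candidates:
        return []
    nearest = max(candidates, key=lambda w: cosine(src_vecs[oov], src_vecs[w]))
    # reuse the neighbour's target sides as inferred phrase pairs for the OOV word
    return list(phrase_table[nearest])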

Citations: cited by 7 publications (7 citation statements)
References: 23 publications
“…Cross-lingual word embedding is studied for use in MT as follows. In phrase-based SMT, Alkhouli et al (2014) build translation models with word/phrase embeddings. Kim et al (2018) use cross-lingual word embedding as a basic translation model for unsupervised MT and attach other components on top of it.…”
Section: Related Work
confidence: 99%
“…Following the work in Mikolov et al (2013b) and Alkhouli et al (2014), we introduce an additional feature in the translation model:…”
Section: Semantic Scoring Feature
confidence: 99%
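The feature itself is elided in the quoted statement above. A minimal, assumed form of such a semantic score (not necessarily the one used in the cited work) is the cosine similarity between a linearly mapped source phrase vector and the target phrase vector; W, src_vec and tgt_vec are illustrative names.

import numpy as np

# Assumed semantic feature for a phrase pair: cosine similarity between the
# source vector mapped into the target space and the target vector.
def semantic_feature(src_vec, tgt_vec, W):
    mapped = W @ src_vec
    den = np.linalg.norm(mapped) * np.linalg.norm(tgt_vec) + 1e-12
    return float(mapped @ tgt_vec) / den  # in [-1, 1]; added as a log-linear model feature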
“…Following the work of Mikolov et al (2013a), Mikolov et al (2013b), and Alkhouli et al (2014), we exploit the idea that vector representations of similar words in different languages are related by a linear transformation. However, we focus on exploring this idea on a specific phenomenon with challenging semantics, namely PVs.…”
Section: Related Work
confidence: 99%
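The linear transformation referred to here is the translation matrix of Mikolov et al. (2013b): a map W is learned so that W x_i ≈ z_i for seed pairs of source and target word vectors. Below is a minimal least-squares sketch of that idea (the original work optimises the same objective with stochastic gradient descent); variable names are illustrative.

import numpy as np

def learn_translation_matrix(X, Z):
    # X: (n, d_src) source word vectors, Z: (n, d_tgt) target word vectors for a seed lexicon.
    # Returns W of shape (d_tgt, d_src) minimising sum_i ||W x_i - z_i||^2.
    W_T, *_ = np.linalg.lstsq(X, Z, rcond=None)  # solves X @ W_T ≈ Z in the least-squares sense
    return W_T.T

# A new source word vector x is then "translated" by picking the target word
# whose vector is closest (e.g. by cosine similarity) to W @ x.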
“…Their model reaches comparable results to Saluja et al (2014) while working faster. Alkhouli et al (2014) use neural network phrase representations for paraphrasing OOVs and find translations for them using a phrase table created from limited parallel data. Our experimental setting is different from the approaches in (Alkhouli et al, 2014; Mikolov et al, 2013a; Mikolov et al, 2013b).…”
Section: Related Work
confidence: 99%