Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing, 2015
DOI: 10.3115/v1/p15-1119

Cross-lingual Dependency Parsing Based on Distributed Representations

Abstract: This paper investigates the problem of cross-lingual dependency parsing, aiming at inducing dependency parsers for low-resource languages while using only training data from a resource-rich language (e.g. English). Existing approaches typically don't include lexical features, which are not transferable across languages. In this paper, we bridge the lexical feature gap by using distributed feature representations and their composition. We provide two algorithms for inducing cross-lingual distributed representat…
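The abstract describes bridging the lexical feature gap with cross-lingual distributed representations and their composition. As a rough illustration only (not the paper's exact algorithm), the sketch below shows how a parser's lexical features could be built by composing pre-trained cross-lingual word embeddings, so that a model trained on English features can be applied to a target language that shares the same embedding space. The file format and the names `load_embeddings` and `compose_features` are assumptions for this sketch.

```python
# Minimal sketch: composing cross-lingual word embeddings into parser features.
# Assumption: both languages' vectors live in one shared space, stored one word
# per line as "word <dim floats>".
import numpy as np

def load_embeddings(path):
    """Load a whitespace-separated embedding file into a {word: vector} dict."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split()
            vectors[parts[0]] = np.asarray(parts[1:], dtype=np.float32)
    return vectors

def compose_features(words, vectors, dim=50):
    """Concatenate the embeddings of the words in a parser configuration
    (e.g., top of stack and front of buffer) into one feature vector.
    Unknown words fall back to a zero vector."""
    parts = [vectors.get(w.lower(), np.zeros(dim, dtype=np.float32)) for w in words]
    return np.concatenate(parts)

# Usage: the same feature function serves English training data and
# target-language test data, because the embedding space is shared.
# en_vecs = load_embeddings("crosslingual.en-de.vec")   # hypothetical file
# feats = compose_features(["the", "dog", "barks"], en_vecs)
```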

Cited by 163 publications (163 citation statements); references 23 publications (27 reference statements).

Citation statements (ordered by relevance):
“…Previous work has shown its effectiveness across a wide range of multilingual transfer tasks including tagging (Kim et al., 2015), syntactic parsing (Xiao and Guo, 2014; Guo et al., 2015; Durrett et al., 2012), and machine translation (Zou et al., 2013; Mikolov et al., 2013b). However, these approaches commonly require parallel sentences or a bilingual lexicon to learn multilingual embeddings.…”
Section: Multilingual Word Embeddings (mentioning)
confidence: 99%
“…2). SBWES may be used to support many tasks, e.g., computing cross-lingual/multilingual semantic word similarity (Faruqui and Dyer, 2014), learning bilingual word lexicons (Mikolov et al., 2013a; Gouws et al., 2015), cross-lingual entity linking (Tsai and Roth, 2016), parsing (Guo et al., 2015; Johannsen et al., 2015), machine translation (Zou et al., 2013), or cross-lingual information retrieval (Vulić and Moens, 2015; Mitra et al., 2016).…”
Section: Monolingual vs. Bilingual (mentioning)
confidence: 99%
“…We thus include word features using bilingual dictionaries, i.e. translating the words used as features into a single language (English), or through cross-lingual word embeddings as proposed in (Guo et al., 2015) for dependency parsing. More precisely, we used the cross-lingual word representations presented in (Levy et al., 2017) that allow multi-source learning and have proven useful for POS tagging but also for more semantics-oriented tasks, such as dependency parsing and document classification.…”
Section: Cross-lingual Discourse Parsing (mentioning)
confidence: 99%
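The statement above mentions two ways of making word features transferable: translating them into English via a bilingual dictionary, or using cross-lingual embeddings. Below is a minimal sketch of the dictionary route under stated assumptions; `bilingual_dict` and the out-of-dictionary fallback are illustrative, not the cited papers' exact setup.

```python
# Translate each target-language word feature into English before feature
# extraction, so a single English-trained model can be reused.
def translate_features(words, bilingual_dict):
    """Map target-language tokens to English via a bilingual dictionary.
    Out-of-dictionary tokens are kept unchanged (one possible fallback)."""
    return [bilingual_dict.get(w.lower(), w) for w in words]

# Example with a toy French-English dictionary:
# bilingual_dict = {"chien": "dog", "aboie": "barks"}
# translate_features(["Le", "chien", "aboie"], bilingual_dict)
# -> ["Le", "dog", "barks"]
```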