2014 International Conference on Asian Language Processing (IALP)
DOI: 10.1109/ialp.2014.6973490
Learning word embeddings from dependency relations

Cited by 10 publications (7 citation statements) · References 4 publications
Citing publications appeared between 2015 and 2021.
“…In addition, PPMI is conceptually simple and fast to calculate, so we chose it over the newer alternatives. There are some existing word embeddings that take dependency structure into account, for example, Gamallo (2017), Levy and Goldberg (2014a), Padó and Lapata (2007), Rei and Briscoe (2014), and Zhao, Huang, Dai, Zhang, and Chen (2014). However, these embeddings do not encode the information that we need for use in cognitive models of sentence processing.…”
Section: Methods (mentioning)
confidence: 99%
“…Other work on incorporating syntax into language modeling includes Chelba et al. (1997) and Pauls and Klein (2012); however, none of these approaches considered neural language models, only count-based ones. Levy and Goldberg (2014) and Zhao et al. (2014) proposed to train neural word embeddings using skip-grams and CBOWs on dependency parse trees, but did not extend their approach to actual language models such as LBL and RNN and did not evaluate the word embeddings on word completion tasks.…”
Section: Discussion (mentioning)
confidence: 99%
“…There are some existing word embeddings that take dependency structure into account, e.g., Gamallo (2017), O. Levy and Goldberg (2014a), Padó and Lapata (2007), Rei and Briscoe (2014), Zhao et al. (2014). However, these embeddings do not encode the information that we need for use in cognitive models of sentence processing.…”
Section: Methods (mentioning)
confidence: 99%
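
The dependency-based approach the citing papers refer to (Levy and Goldberg, 2014a; Zhao et al., 2014) replaces linear bag-of-words contexts with contexts derived from parse arcs. Below is a minimal, self-contained Python sketch of that context-extraction step, assuming a toy hard-coded parse; the sentence, the triples, and the dependency_contexts helper are illustrative, not taken from either paper.

    # Minimal sketch of dependency-based skip-gram contexts in the spirit of
    # Levy and Goldberg (2014a); the parse triples below are a hypothetical,
    # hard-coded example rather than the output of a real parser.

    # (head, relation, dependent) arcs for the sentence
    # "australian scientist discovers star with telescope".
    toy_parse = [
        ("scientist", "amod", "australian"),
        ("discovers", "nsubj", "scientist"),
        ("discovers", "dobj", "star"),
        ("discovers", "prep_with", "telescope"),  # collapsed preposition
    ]

    def dependency_contexts(triples):
        # Each arc yields two (word, context) training pairs:
        # the head sees "relation/dependent", and the dependent sees
        # the inverse context "relation-1/head".
        for head, rel, dep in triples:
            yield head, f"{rel}/{dep}"
            yield dep, f"{rel}-1/{head}"

    for word, context in dependency_contexts(toy_parse):
        print(word, context)

The resulting (word, context) pairs would then be fed to a skip-gram trainer that treats contexts as arbitrary symbols rather than window-based neighbors, along the lines of the word2vecf variant described by Levy and Goldberg.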