Proceedings of the 2015 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2015
DOI: 10.3115/v1/n15-1050
Modeling Word Meaning in Context with Substitute Vectors

Abstract: Context representations are a key element in distributional models of word meaning. In contrast to typical representations based on neighboring words, a recently proposed approach suggests to represent a context of a target word by a substitute vector, comprising the potential fillers for the target word slot in that context. In this work we first propose a variant of substitute vectors, which we find particularly suitable for measuring context similarity. Then, we propose a novel model for representing word m…
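The abstract's core idea, representing a context by the distribution of potential fillers for the target word's slot, can be sketched in a few lines. The snippet below is an illustrative toy only: it scores candidate fillers by trigram counts over a small corpus, whereas the paper estimates filler probabilities with a proper language model; the function and variable names are my own.

```python
from collections import Counter

def substitute_vector(left, right, corpus_sentences, vocab):
    """Toy substitute vector for the slot `left _ right`: score each
    candidate filler w by how often the trigram (left, w, right)
    occurs in the corpus, then normalize over the candidate vocab.
    (Illustrative stand-in for an n-gram LM estimate.)"""
    counts = Counter()
    for sent in corpus_sentences:
        for i in range(1, len(sent) - 1):
            if sent[i - 1] == left and sent[i + 1] == right:
                counts[sent[i]] += 1
    total = sum(counts[w] for w in vocab) or 1
    return {w: counts[w] / total for w in vocab}
```

Two contexts can then be compared by the similarity of their substitute vectors rather than of their neighboring words.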

Cited by 39 publications (48 citation statements)
References 13 publications
“…As optimization metric we have explored both NDCG@10 and MAP. The NDCG metric can incorporate different scoring weights. The omission of MWE by multiple authors has been confirmed by the authors of (Melamud et al., 2015a).…”
Section: Transfer Learning
confidence: 93%
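For readers unfamiliar with the metric named in this excerpt, NDCG@k compares a ranking's discounted cumulative gain against that of the ideal ordering. The sketch below uses the standard linear-gain formulation; the cited work may use different gains or weights, and the function name is mine.

```python
import math

def ndcg_at_k(relevances, k=10):
    """NDCG@k for a ranked list of graded relevance scores:
    DCG of the given order divided by DCG of the ideal
    (descending) order, truncated at rank k."""
    def dcg(rels):
        # log2(i + 2) because ranks are 1-based and log2(1) = 0
        return sum(r / math.log2(i + 2) for i, r in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0
```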
“…We adapt the unsupervised approach by (Melamud et al., 2015a) as a set of features. We follow (Levy and Goldberg, 2014) to construct dependency-based word embeddings; we obtain syntactic contexts by running a syntactic dependency parser, and computing word embeddings using dependency edges as context features.…”
Section: Syntactic Word Embeddings
confidence: 99%
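The dependency-based contexts this excerpt refers to can be sketched as follows: each dependency edge yields a (word, context) training pair in both directions, in the spirit of Levy and Goldberg (2014). This is a simplification; the original work also collapses preposition relations, and the exact label format below is assumed for illustration.

```python
def dependency_contexts(edges):
    """Turn dependency edges (head, relation, dependent) into
    (word, context) pairs: the head sees the dependent through the
    relation, and the dependent sees the head through its inverse
    (marked here with a "-1" suffix)."""
    pairs = []
    for head, rel, dep in edges:
        pairs.append((head, f"{rel}_{dep}"))
        pairs.append((dep, f"{rel}-1_{head}"))
    return pairs
```

These pairs can then be fed to a skip-gram trainer such as word2vecf in place of linear-window contexts.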
“…So, they cannot be merged. To combine them, on the basis of previous work (Thater et al., 2010; Erk and Padó, 2008; Melamud et al., 2015), we distinguish between direct denotation and selectional preferences within a dependency relation. Our approach is an attempt to join the main ideas of these syntax-based and structured vector space models into an entirely compositional model.…”
Section: Related Work
confidence: 99%
“…From the set of assertions extracted in §3.2, we create a dataset of relation-argument pairs, and use word2vecf to train the embeddings. We also tried to use the arguments' embeddings to induce a context-sensitive measure of similarity, as suggested by Melamud et al. (2015); however, this method did not improve performance on our dataset.…”
Section: Entailment Graph
confidence: 99%