Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP '08), 2008
DOI: 10.3115/1613715.1613831

A structured vector space model for word meaning in context

Abstract: We address the task of computing vector space representations for the meaning of word occurrences, which can vary widely according to context. This task is a crucial step towards a robust, vector-based compositional account of sentence meaning. We argue that existing models for this task do not take syntactic structure sufficiently into account. We present a novel structured vector space model that addresses these issues by incorporating the selectional preferences for words' argument positions. This makes it …
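As a rough illustration of the model the abstract describes, the sketch below contextualizes a word by combining its out-of-context vector with a selectional-preference vector for the argument slot it fills (the frequency-weighted centroid of observed fillers). All names, the toy vectors and counts, and the choice of pointwise multiplication are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Toy out-of-context vectors (in practice: distributional vectors
# estimated from a corpus).
lexicon = {
    "catch": np.array([0.9, 0.1, 0.4]),
    "ball":  np.array([0.8, 0.2, 0.1]),
    "cold":  np.array([0.1, 0.9, 0.3]),
    "fish":  np.array([0.7, 0.3, 0.2]),
}

def preference_vector(slot_fillers):
    """Selectional-preference vector for an argument slot: the
    frequency-weighted centroid of the vectors of its observed fillers
    (toy (word, count) pairs below are hypothetical)."""
    total = sum(count for _, count in slot_fillers)
    return sum(count * lexicon[word] for word, count in slot_fillers) / total

def contextualize(target, slot_fillers):
    """In-context vector for `target`: combine its out-of-context vector
    with the preference vector of the slot it occupies. Pointwise
    multiplication is one of several possible combination functions."""
    return lexicon[target] * preference_vector(slot_fillers)

# 'ball' as the object of 'catch': contextualized by what has been
# observed in the object slot of 'catch', with toy frequencies.
catch_object_fillers = [("ball", 5), ("cold", 3), ("fish", 2)]
print(contextualize("ball", catch_object_fillers))
```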

Citations: cited by 242 publications (247 citation statements)
References: 39 publications (61 reference statements)
“…We do this by computing vector representations for word meaning in context, and predicting rule applicability based on these context-specific vectors. We follow the literature on vector space representations for word meaning in context (Erk and Padó, 2008; Thater et al., 2010; Reisinger and Mooney, 2010; Dinu and Lapata, 2010; Van de Cruys et al., 2011) in assuming that a word's context-specific meaning is a function of its out-of-context representation and the context. The context may consist of a single item or multiple items, and (syntactic or semantic) relations to the target word may also play a role (Erk and Padó, 2008; Thater et al., 2010; Van de Cruys et al., 2011).…”
Section: Addressing Polysemy (mentioning)
confidence: 99%
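A minimal sketch of the shared assumption this statement describes: the context-specific meaning is some function of the out-of-context vector and the context, where the context may be one item or several. The function name and the multiplicative and additive combinations are illustrative assumptions; the cited models differ in exactly this choice and in how syntactic or semantic relations enter.

```python
import numpy as np

def contextualize(target_vec, context_vecs, mode="mult"):
    """Context-specific meaning as a function of the out-of-context
    vector and the context. The context may be a single item or several;
    here it is summarized by a centroid. The combination function is an
    illustrative choice, not any one cited model."""
    context = np.mean(context_vecs, axis=0)
    return target_vec * context if mode == "mult" else target_vec + context

# One context item, or several:
v_run = np.array([0.6, 0.4, 0.2])
print(contextualize(v_run, [np.array([0.1, 0.9, 0.5])]))
print(contextualize(v_run, [np.array([0.1, 0.9, 0.5]),
                            np.array([0.3, 0.2, 0.8])], mode="add"))
```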
“…This follows common practice in vector space models of word meaning in context: computing a context-specific representation of the target, but not of the paraphrase candidate. But if the paraphrase candidate is polysemous, it may be useful to compute a representation for it that is also specific to the sentence context at hand (Erk and Padó, 2010). We can do this by defining a lexical mapping γ_{P,G}, specific to predicate P and formula G, by γ_{P,G}(Q) = ⟨α(Q), κ(P, G)⟩.…”
Section: Addressing Polysemy (mentioning)
confidence: 99%
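A sketch of the point this statement makes: contextualize both the target and the paraphrase candidate against the same sentence context before comparing them, rather than contextualizing only the target. All helper names are hypothetical; pointwise multiplication and cosine similarity are illustrative choices, not the cited papers' exact methods.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def paraphrase_score(target_vec, candidate_vec, context_vecs):
    """Contextualize BOTH the target and the paraphrase candidate in the
    same sentence context, then compare them, instead of contextualizing
    only the target (illustrative combination by multiplication)."""
    context = np.mean(context_vecs, axis=0)
    return cosine(target_vec * context, candidate_vec * context)

# Toy example: does 'grab' paraphrase 'catch' in this context?
catch = np.array([0.9, 0.1, 0.4])
grab = np.array([0.8, 0.2, 0.5])
context = [np.array([0.7, 0.3, 0.2])]  # e.g. the object 'ball'
print(paraphrase_score(catch, grab, context))
```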