2014
DOI: 10.33011/lilt.v9i.1321
Frege in Space: A Program for Compositional Distributional Semantics

Abstract: The lexicon of any natural language encodes a huge number of distinct word meanings. Just to understand this article, you will need to know what thousands of words mean. The space of possible sentential meanings is infinite: In this article alone, you will encounter many sentences that express ideas you have never heard before, we hope. Statistical semantics has addressed the issue of the vastness of word meaning by proposing methods to harvest meaning automatically from large collections of text (corpora). Fo…

Cited by 152 publications (80 citation statements)
References 115 publications (128 reference statements)
“…This choice of operator meant that the same representation could be used for a collection of words in a document, irrespective of the order in which the words appear. By contrast, discrete logical models used in formal semantics have for many decades been quite explicit about the ways words should be combined, but were often notably silent about what those words mean in themselves (see Widdows (2008); Baroni, Bernardi, Zamparelli, et al (2014) for surveys of this methodological difference between traditions). This history of two modelling frameworks led to an unnecessary and unfortunate gap: there are many interesting product operations such as the tensor product between vectors that are well-established in linear algebra, but for years there was relatively little awareness of these alternatives in language research.…”
Section: Explicit Composition With Vectors and Tensors In AI
confidence: 99%
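The contrast this excerpt draws, between composition that ignores word order and a tensor product that preserves it, can be made concrete with a small sketch. The vectors below are toy illustrations, not data from any of the cited models:

```python
import numpy as np

# Hypothetical 3-dimensional word vectors (purely illustrative).
dog = np.array([0.9, 0.1, 0.3])
bites = np.array([0.2, 0.8, 0.5])

# Additive (bag-of-words) composition: the same representation
# regardless of the order in which the words appear.
assert np.allclose(dog + bites, bites + dog)

# Tensor (outer) product composition: order matters.
dog_bites = np.outer(dog, bites)   # a 3x3 matrix
bites_dog = np.outer(bites, dog)
assert not np.allclose(dog_bites, bites_dog)
# The two orders are transposes of each other.
assert np.allclose(dog_bites, bites_dog.T)
```

The price of order sensitivity is that the composed representation lives in a larger space (a matrix rather than a vector), one reason tensor-based proposals needed explicit mathematical machinery rather than a single fixed operator.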
“…For example, Baroni and Zamparelli (2010) used this approach to model the action of adjectives upon nouns, and Socher, Huval, Manning, and Ng (2012) took it to the logical destination of representing each internal node in a parse tree as a matrix operator acting upon its input arguments. This area has become known as Compositional Distributional Semantics, summarized in works like Baroni et al (2014), and work in this area has continued, an example being the work of Sadrzadeh, Kartsaklis, and Balkır (2018) on sentence entailment in this framework.…”
Section: Explicit Composition With Vectors and Tensors In AI
confidence: 99%
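The adjective-as-operator idea this excerpt attributes to Baroni and Zamparelli (2010) can be sketched as matrix-vector multiplication. In the actual model the adjective matrix is learned from corpus co-occurrence data; the random matrix below is only a stand-in:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy noun vector (illustrative dimensionality).
n_dim = 4
house = rng.standard_normal(n_dim)

# An adjective such as "red" is modeled as a matrix operator;
# here a random matrix stands in for a corpus-learned one.
RED = rng.standard_normal((n_dim, n_dim))

# Composition is function application: "red house" = RED applied to house.
red_house = RED @ house
assert red_house.shape == (n_dim,)
```

Because the output is again an `n_dim` vector, the composed phrase stays in the same space as plain nouns, so it can itself serve as input to further operators higher in a parse tree, the move Socher et al. (2012) generalize.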
“…Previous approaches that aimed to combine the strengths of formal and distributional semantics have tried to do so by either expanding distributional approaches to account for propositional-level inferences [13,15,16,17], or, conversely, by expanding formal approaches with a distributional component to account for lexical-level similarity [20,21,22]. By contrast, the DFS framework fundamentally integrates the distributional hypothesis into a formal semantic model, while maintaining the proposition-central perspective on meaning.…”
Section: DFS Distributional Semantics Offer Complementary Meaning Rep…
confidence: 99%
“…Indeed, while formal semantics focuses on proposition-level (sentence) meanings and distributional semantics focuses on the level of words, there has been considerable interest in bringing together the strengths of both approaches within a single formalism. This has been attempted, for instance, by defining a notion of composition on top of the distributional representations, using vector operations [10], or by using more complex structures (e.g., matrices and tensors) in addition to vectors to represent lexical expressions [11,12,13,14,15,16,17]. Although this has been shown to produce interesting results when applied to adjective-noun modification [18], the approach has difficulties in representing the meaning of complex, multi-argument expressions (see [19,7] for reviews).…”
Section: Introduction
confidence: 99%
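The "composition on top of the distributional representations, using vector operations" that this excerpt mentions classically means addition or pointwise multiplication. A minimal sketch, with illustrative vectors:

```python
import numpy as np

def compose_add(u, v):
    """Additive composition: p = u + v."""
    return u + v

def compose_mult(u, v):
    """Pointwise multiplicative composition: p = u * v."""
    return u * v

# Hypothetical word vectors for "old" and "man" (toy values).
old = np.array([0.5, 0.1, 0.9])
man = np.array([0.4, 0.7, 0.2])

p_add = compose_add(old, man)
p_mult = compose_mult(old, man)

# Both operations keep the phrase in the same space as the words,
# unlike tensor products, whose dimensionality grows with each
# additional argument.
assert p_add.shape == p_mult.shape == old.shape
```

That dimensionality growth under tensor composition is one concrete face of the difficulty with multi-argument expressions that the excerpt notes.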
“…More recent contributions can be essentially divided into two separate trends. The former attempts to model 'Fregean compositionality' in vector space, and aims at finding progressively more sophisticated compositional operations to derive sentence representations from the vectors of the words composing them (Baroni et al 2013, Paperno et al 2014). In the latter trend, dense vectors for sentences are learned as a whole, in a similar way to neural word embeddings (Mikolov et al 2013, Levy and Goldberg 2014): for example, the encoder-decoder models of works like Kiros et al (2015) and Hill et al (2016) are trained to predict, given a sentence vector, the vectors of the surrounding sentences.…”
Section: Sentence Meaning In Vector Spaces
confidence: 99%