Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.663

What are the Goals of Distributional Semantics?

Abstract: Distributional semantic models have become a mainstay in NLP, providing useful features for downstream tasks. However, assessing long-term progress requires explicit long-term goals. In this paper, I take a broad linguistic perspective, looking at how well current models can deal with various semantic challenges. Given stark differences between models proposed in different subfields, a broad perspective is needed to see how we could integrate them. I conclude that, while linguistic insights can guide the design of model architectures, future progress will require balancing the often conflicting demands of linguistic expressiveness and computational tractability.

Cited by 18 publications (19 citation statements). References 127 publications.
“…Distributional semanticists have long been aware that grounding distributional representations in the real world is challenging. The lexical similarity relations learned by distributional models trained on text don't in themselves connect any of those words to the world (Herbelot, 2013; Baroni et al., 2014; Erk, 2016; Emerson, 2020), and the distributions of words may not match the distribution of things in the world (consider four-legged dogs).…”
Section: Distributional Semantics (mentioning, confidence: 99%)
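To make the grounding point concrete, here is a minimal sketch, assuming hypothetical toy vectors rather than anything taken from a real trained model: cosine similarity relates words only to other words, and nothing in the arithmetic connects "dog" to actual dogs.

import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical 4-dimensional embeddings, invented for illustration.
vectors = {
    "dog":   np.array([0.9, 0.8, 0.1, 0.0]),
    "cat":   np.array([0.8, 0.9, 0.2, 0.0]),
    "leash": np.array([0.7, 0.3, 0.6, 0.1]),
}

# High similarity says only that "dog" and "cat" occur in similar
# contexts; no entry in either vector picks out a referent in the world.
print(cosine(vectors["dog"], vectors["cat"]))    # high
print(cosine(vectors["dog"], vectors["leash"]))  # lower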
“…A key question in the development of representations is what aspects of meaning these representations should capture. Indeed, recent reflections have drawn attention to challenges such as polysemy and hyponymy (Emerson, 2020) and construed meaning (Trott et al., 2020). However, even though Bender and Lascarides (2019, p. 20)…”
[Figure 1: With all three utterances, the author asks how someone is doing, but the spelling variants carry different social meanings.]
Section: Representing Social Meaning (mentioning, confidence: 99%)
“…[32] NLP systems operate under the distributional hypothesis: the words surrounding a word in question give clues to its meaning, and, taken in aggregate, all of its contexts appear to give us what we seek.[2] This may not hold, especially in technical text, although techniques using contextual information have been developed in the automotive industry.[31] The ability to generalize, that is, for a model to behave as expected in novel situations beyond the training context, is closely related to the problem of meaning.…”
Section: Do Algorithms Understand? (mentioning, confidence: 99%)
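The distributional hypothesis this statement describes can be sketched in a few lines: represent each word by the aggregate of its observed contexts. The toy corpus and window size below are assumptions made here for illustration, not taken from any cited system.

from collections import Counter, defaultdict

corpus = [
    "the engine failed after the oil pump seized",
    "the pump failed after the engine overheated",
    "she read the report after lunch",
]

WINDOW = 2  # context words on each side; an assumed hyperparameter
contexts = defaultdict(Counter)

for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        lo, hi = max(0, i - WINDOW), min(len(tokens), i + WINDOW + 1)
        for j in range(lo, hi):
            if j != i:
                contexts[word][tokens[j]] += 1

# "engine" and "pump" end up with overlapping context counts
# ("the", "failed", "after"), so a distributional model treats them as
# similar, whether or not those contexts pin down the technical meaning.
print(contexts["engine"].most_common(3))
print(contexts["pump"].most_common(3))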
“…These were enabled by improvements in language models that predict characters, words, or sentences from surrounding context, which have become a central theme in NLP research.[1-3] The foremost example, Generative Pre-trained Transformer 3 (GPT-3), has been dubbed the "most powerful language model ever"[4] and recently demonstrated strong performance on many existing data sets for a variety of NLP tasks such as translation, question answering, unscrambling words, and news article generation.[5] Early users have shown its ability to generate text ranging from guitar tablature, to website layouts, to computer code.…”
(mentioning, confidence: 99%)
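The language-modelling objective this statement refers to, predicting a word from its surrounding context, can be illustrated with a deliberately tiny stand-in for models like GPT-3: a bigram model over an invented corpus. Both the model class and the corpus are assumptions made here for illustration only.

from collections import Counter, defaultdict

corpus = "the cat sat on the mat . the dog sat on the rug .".split()

# Count how often each word follows each other word.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def predict(prev: str) -> str:
    """Most frequent continuation of `prev` in the training corpus."""
    return bigrams[prev].most_common(1)[0][0]

print(predict("sat"))  # "on": both contexts agree on this continuation
print(predict("the"))  # tie between "cat", "mat", "dog", "rug"; count order decides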