2019
DOI: 10.3390/e21121159

Semantic Entropy in Language Comprehension

Abstract: Language is processed on a more or less word-by-word basis, and the processing difficulty induced by each word is affected by our prior linguistic experience as well as our general knowledge about the world. Surprisal and entropy reduction have been independently proposed as linking theories between word processing difficulty and probabilistic language models. Extant models, however, are typically limited to capturing linguistic experience and hence cannot account for the influence of world knowledge. A recent…
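For orientation, the sketch below illustrates how the two linking theories named in the abstract, surprisal and entropy reduction, are typically computed from a probabilistic language model. The toy distributions and function names are hypothetical; they stand in for whatever model (n-gram, neural, or the knowledge-driven model proposed in the paper) supplies next-word probabilities, and entropy reduction is computed here over next-word distributions as a simplification of the sentence-level formulation.

```python
import math

def surprisal(p_next, word):
    """Surprisal in bits: -log2 P(word | context)."""
    return -math.log2(p_next[word])

def entropy(p_next):
    """Shannon entropy in bits of the distribution over continuations."""
    return -sum(p * math.log2(p) for p in p_next.values() if p > 0)

# Hypothetical next-word distributions before and after processing a word:
p_before = {"bank": 0.4, "store": 0.3, "river": 0.3}   # P(w_t | w_1 .. w_t-1)
p_after  = {"teller": 0.7, "loan": 0.2, "vault": 0.1}  # P(w_t+1 | w_1 .. w_t)

print(surprisal(p_before, "bank"))           # difficulty attributed to "bank"
print(entropy(p_before) - entropy(p_after))  # entropy reduction at "bank"
```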



Cited by 25 publications (23 citation statements)
References 45 publications
“…That is, the more likely the interpretation at t given the interpretation at t − 1, the lower the Surprisal induced by word w_t (see Venhuizen et al., 2019b, for a similar DSS-derived conceptualization of Entropy).…”
Section: A Neurocomputational Model (mentioning)
confidence: 99%
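As a gloss on the quoted statement (a reformulation for clarity, not necessarily the cited authors' exact notation), the relationship can be written as

$$ S(w_t) = -\log P(I_t \mid I_{t-1}), \qquad H_t = -\sum_{I} P(I \mid w_{1..t}) \log P(I \mid w_{1..t}), $$

where $I_t$ denotes the interpretation reached after word $w_t$: the more expected the updated interpretation is given the previous one, the lower the surprisal, while the companion entropy measure $H_t$ quantifies the uncertainty that remains over candidate interpretations after $w_t$.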
“…Recent work has also started to integrate world knowledge and linguistic experience in models of online sentence comprehension, including both surprisal (Venhuizen, Crocker, & Brouwer, 2019a) and entropy reduction (Venhuizen, Crocker, & Brouwer, 2019b). The modeling framework described in this paper is flexible enough to include discourse information as an ingredient, for example, by quantifying the expectation of ongoing discourse referents as a degree of uncertainty.…”
Section: Toward Discourse‐informed Predictions (mentioning)
confidence: 99%
“…As a formal index of paradigmatic variability we use entropy, which measures the contribution of linguistic units (e.g., words) in predicting linguistic choice in bits of information. Using entropy provides us with a link to a communicative interpretation, as it is a well-established measure of communicative efficiency with implications for cognitive processing (Linzen and Jaeger, 2016; Venhuizen et al., 2019); also, entropy is negatively correlated with distance in (word embedding) spaces, which in turn shows cognitive reflexes in certain language processing tasks (Mitchell et al., 2008; Auguste et al., 2017). In terms of domain we focus on science, looking at the diachronic development of scientific English from the 17th century to modern time.…”
mentioning
confidence: 99%
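To make the "bits of information" reading in the last excerpt concrete, here is a minimal, hypothetical illustration of entropy as a measure of paradigmatic variability, i.e. how predictable the choice among competing words in the same slot is; the word frequencies are invented:

```python
import math

def choice_entropy(counts):
    """Entropy in bits of the distribution over competing word choices."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values() if c > 0)

# Invented frequencies of words competing for the same paradigmatic slot:
narrow_variation = {"results": 90, "findings": 8, "outcomes": 2}
wide_variation = {"results": 30, "findings": 25, "outcomes": 25, "data": 20}

print(choice_entropy(narrow_variation))  # low entropy: the choice is highly predictable
print(choice_entropy(wide_variation))    # higher entropy: more uncertainty in the slot
```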