Proceedings of the Fifth Workshop on Cognitive Modeling and Computational Linguistics 2014
DOI: 10.3115/v1/w14-2002
Investigating the role of entropy in sentence processing

Abstract: We outline four ways in which uncertainty might affect comprehension difficulty in human sentence processing. These four hypotheses motivate a self-paced reading experiment, in which we used verb subcategorization distributions to manipulate the uncertainty over the next step in the syntactic derivation (single-step entropy) and the surprisal of the verb's complement. We additionally estimate word-by-word surprisal and total entropy over parses of the sentence using a probabilistic context-free grammar (PCFG).
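The two quantities manipulated in the experiment can be sketched with a toy verb subcategorization distribution. The probabilities below are illustrative assumptions, not values from the paper:

```python
import math

# Hypothetical subcategorization distribution for a verb: probability of
# each possible complement type (illustrative numbers only).
subcat = {"NP": 0.6, "S-comp": 0.3, "PP": 0.1}

# Single-step entropy: uncertainty over the next step in the derivation,
# H = -sum(p * log2(p)) over the complement types.
entropy = -sum(p * math.log2(p) for p in subcat.values())

# Surprisal of an observed complement: -log2 of its probability.
# A sentential complement here is less expected than an NP, so it
# carries higher surprisal.
surprisal_scomp = -math.log2(subcat["S-comp"])
surprisal_np = -math.log2(subcat["NP"])
```

Under this toy distribution, entropy is about 1.30 bits, and the S-complement (surprisal ≈ 1.74 bits) is more surprising than the NP (≈ 0.74 bits).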

Cited by 14 publications (12 citation statements)
References 21 publications
“…Surprisal is often used to quantify the processing cost of linguistic units: a reliable correlation has been established whereby a more surprising, more informative word or utterance takes more time to read and/or more effort to process (Demberg and Keller, 2008; Smith and Levy, 2013). A similar connection has been found between effort and the concept of entropy reduction, according to which a more informative word reduces uncertainty about the entire sentence structure more, and hence might take longer to read (Linzen and Jaeger, 2014). This concept may also account for recent findings by Maess et al. (2016).…”
Section: Surprisal and Entropy Reduction in the VWP (supporting)
confidence: 56%
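The entropy-reduction measure discussed in this statement can be illustrated with a minimal sketch: the informativity of a word is the drop in entropy over candidate sentence structures from before to after reading it. The parse labels and probabilities below are hypothetical:

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a probability distribution over parses."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Hypothetical distributions over candidate sentence structures before and
# after an informative word is read (illustrative numbers only).
before = {"parse_a": 0.5, "parse_b": 0.3, "parse_c": 0.2}
after = {"parse_a": 0.9, "parse_b": 0.1}

# Entropy reduction: how much the word decreased structural uncertainty.
# Larger reductions are predicted to correlate with longer reading times.
reduction = entropy(before) - entropy(after)
```

Here the word rules out one parse and concentrates probability on another, so the reduction is positive (roughly one bit under these toy numbers).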
“…If the P600 indeed reflects syntactic reanalysis, we could therefore have seen surprisal effects on the P600. Even an entropy-reduction effect could not have been excluded in advance, considering that Hale (2003) and Linzen and Jaeger (2014) demonstrate that some garden paths can be viewed as effects of entropy reduction rather than surprisal. However, the P600 has also been found in cases that do not involve increased syntactic processing difficulty (e.g., Hoeks, Stowe, & Doedens, 2004; Kuperberg, Kreher, Sitnikova, Caplan, & Holcomb, 2007; Regel, Gunter, & Friederici, 2011; Van Berkum, Koornneef, Otten, & Nieuwland, 2007).…”
Section: Late Positivities (mentioning)
confidence: 93%
“…This study also investigates the extent to which non-DLT measures of processing complexity can predict syntactic choice. While DLT predicts an influence from the length of dependencies, increased memory load may also reduce the amount of resources available to process language (Chomsky & Miller 1963, Schuler, AbdelRahman, Miller, & Schwartz 2010, Yngve 1960). [Footnote 3: For a discussion of embedding depth as part of the prediction process, please refer to Linzen and Jaeger (2014).]…”
Section: Introduction (mentioning)
confidence: 99%