Uncertainty Reduction as a Measure of Cognitive Load in Sentence Comprehension
Frank (2013). DOI: 10.1111/tops.12025

Abstract: The entropy-reduction hypothesis claims that the cognitive processing difficulty on a word in sentence context is determined by the word's effect on the uncertainty about the sentence. Here, this hypothesis is tested more thoroughly than has been done before, using a recurrent neural network for estimating entropy and self-paced reading for obtaining measures of cognitive processing load. Results show a positive relation between reading time on a word and the reduction in entropy due to processing that word, s…
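The entropy-reduction metric the abstract refers to can be written out as follows (a standard formulation; the notation is ours, not quoted from the paper):

H_t = -\sum_{s} P(s \mid w_1, \ldots, w_t) \log_2 P(s \mid w_1, \ldots, w_t)

\Delta H_t = H_{t-1} - H_t

Here s ranges over the possible ways the sentence may continue after the first t words. The hypothesis predicts that reading time on word w_t grows with \Delta H_t; as the citation statements below note, the version tested in this paper does not clip \Delta H_t at zero, so it may take negative values.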


Years of citing publications: 2013–2024


Cited by 117 publications (113 citation statements). References 34 publications (56 reference statements).

“…In related work, Frank (2013) tested a version of the entropy reduction hypothesis whereby entropy reduction was not bounded by 0 (was allowed to take negative values). A Simple Recurrent Network was used to predict the next four words in the sentence; the uncertainty following the current word was estimated as the entropy of this quadrigram distribution.…”
Section: Discussion (citation type: mentioning; confidence: 99%)
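As a concrete illustration of the estimator described in this excerpt, here is a minimal Python sketch of entropy over a distribution of candidate continuations and the resulting entropy reduction. The function name and the probabilities are illustrative assumptions, not values from the paper:

import numpy as np

def sequence_entropy(probs):
    # Shannon entropy (in bits) of a probability distribution over
    # candidate continuations (here, four-word sequences).
    p = np.asarray(probs, dtype=float)
    p = p[p > 0]  # treat 0 * log(0) as 0
    return float(-(p * np.log2(p)).sum())

# Illustrative numbers only: uncertainty before and after a word.
h_before = sequence_entropy([0.25, 0.25, 0.25, 0.25])  # 2.00 bits
h_after = sequence_entropy([0.70, 0.10, 0.10, 0.10])   # ~1.36 bits

# Entropy reduction; in the version tested by Frank (2013) this
# quantity is not clipped at zero, so it can be negative.
delta_h = h_before - h_after  # ~0.64 bits

In the study itself, the probabilities over four-word continuations came from a Simple Recurrent Network rather than being specified by hand as above.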
“…This same pattern is also part of the empirical support for the minimize domains (MiD) principle (Hawkins, 2004, Section 7.2). There is no necessary connection between Entropy Reduction as a complexity metric and Minimalist Grammars as a formalism. In fact, Frank (2013) recently applied Entropy Reduction to the analysis of British readers' eye fixation times, using Simple Recurrent Nets as a substitute for a grammar. In this study, Entropy Reduction emerged as a significant predictor of fixation duration.…”
Section: Entropy Reduction as a Complexity Metric (citation type: mentioning; confidence: 99%)
“…Two GAMMs were fitted: one for reading latencies on the TRY verb, and another for reading latencies on the following infinitive verb to test for spill-over effects (Just et al., 1982; Frank, 2013). For the TRY verb, the mean reading latency was 798.78 ms (Mdn = 713.63 ms, s = 334.71 ms), while for the infinitive verb the mean reading latency was 783.52 ms (Mdn = 660.96 ms, s = 396.03 ms).…”
Section: Results (citation type: mentioning; confidence: 99%)
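For readers unfamiliar with the method mentioned in this excerpt, here is a minimal sketch of fitting a generalized additive model to reading latencies, assuming the Python pygam package; the simulated data, variable names, and package choice are our assumptions, not details of the cited study:

import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(0)

# Simulated data, for illustration only: one predictor (e.g., entropy
# reduction on the verb) and per-word reading latencies in ms.
x = rng.uniform(0.0, 3.0, size=(200, 1))
y = 700 + 40 * x[:, 0] + rng.normal(0.0, 50.0, size=200)

# One smooth term over the predictor. A full GAMM, as in the cited
# study, would additionally include random effects for subjects and
# items, which pygam does not model; this shows only the "GAM" part.
gam = LinearGAM(s(0)).fit(x, y)
gam.summary()

The smooth term is what distinguishes this from ordinary linear regression: it lets the latency–predictor relation be nonlinear, which is why such models are often preferred for reading-time data.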