2022
DOI: 10.3758/s13423-022-02150-9

The transposed-word effect does not require parallel word processing: Failure to notice transpositions with serial presentation of words

Abstract: Readers sometimes fail to notice word transposition errors, reporting a sentence with two transposed words to be grammatical (the transposed-word effect). It has been suggested that this effect implicates parallel word processing during sentence reading. The current study directly assessed the role of parallel word processing in failure to notice word transposition errors, by comparing error detection under normal sentence presentation conditions and when words are presented serially at 250 ms/word. Extending …
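The serial condition described in the abstract resembles rapid serial visual presentation (RSVP): words are shown one at a time, here at 250 ms per word. A minimal, hypothetical sketch of that presentation scheme (a plain console timing loop, not the authors' actual experimental software, which is not specified in this excerpt):

```python
import time

def present_serially(sentence, word_duration_s=0.250):
    """Show the words of a sentence one at a time, RSVP-style.

    Illustrative only: prints each word and holds it for
    word_duration_s seconds; a real experiment would use dedicated
    presentation software with frame-locked timing.
    """
    for word in sentence.split():
        print(word)
        time.sleep(word_duration_s)  # ~250 ms per word, as in the serial condition

# Example: a sentence with two adjacent words transposed
present_serially("The white was cat big and fluffy")
```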

Cited by 16 publications (28 citation statements)
References 28 publications
“…One could therefore argue that the fact that we did observe this pattern is evidence against a strictly serial, one-word-at-a-time account of reading. Nevertheless, noisy channel accounts of sentence processing (e.g., Gibson et al. [6]) do predict that errors, including transposed-word errors, can go unnoticed even under strictly incremental processing, due to the approximate (or good-enough: Ferreira & Lowder [9]) nature of the sentence-level structures that are computed on the fly (see also Huang & Staub [13]; Milledge et al. [14]). Moreover, as already proposed in related work from our group (e.g., Dufour et al. [15]; Pegado & Grainger [29]; Wen et al. [27, 30]), top-down constraints imposed by sentence-level structures (syntactic and semantic) would contribute to transposed-word effects independently of presentation mode (the same holds for the post-lexical integration account of transposed-word effects proposed by Huang & Staub).…”
Section: Discussion
“…Since the criterion for an “ungrammatical” response is tied to the value set for a “grammatical” response (cf. the Leaky Competing Accumulator and Drift Diffusion models of lexical decision: Dufau et al. [32]; Ratcliff et al. [33]; see also Perea et al. [34]), this encourages fast ungrammatical responses that are associated with an increase in errors across the complete range of RTs, and particularly for transposed-word sequences (see Fig 9). We suspect that the overall greater size of transposed-word effects (in both errors and RTs) seen with parallel presentation compared with serial presentation is due to a combination of an increase in bottom-up positional noise under parallel presentation (see also Huang & Staub [13]) and the different ways that participants perform the grammatical decision task with these two procedures.…”
Section: Discussion
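The appeal to evidence-accumulation models in the quotation above can be made concrete with a toy simulation. The sketch below is a generic two-boundary drift-diffusion process with arbitrary, made-up parameters (it is not the Dufau et al. [32] or Ratcliff et al. [33] model, nor the cited authors' analysis): lowering the decision boundary yields faster responses but more crossings of the boundary that opposes the evidence, the kind of fast-error pattern invoked in the quoted passage.

```python
import random

def diffusion_trial(drift, boundary, noise=1.0, dt=0.001, max_t=5.0):
    """Simulate one two-boundary drift-diffusion trial.

    Evidence starts at 0 and accumulates with the given drift plus
    Gaussian noise; the response is whichever boundary is crossed first:
    +boundary ("grammatical") or -boundary ("ungrammatical").
    Returns (response, reaction_time_in_seconds).
    """
    x, t = 0.0, 0.0
    while t < max_t:
        x += drift * dt + noise * (dt ** 0.5) * random.gauss(0.0, 1.0)
        t += dt
        if x >= boundary:
            return "grammatical", t
        if x <= -boundary:
            return "ungrammatical", t
    return "timeout", t

# Illustrative parameters only (not fitted values from any cited study):
# a positive drift stands in for a sequence whose evidence mostly favours
# a "grammatical" reading. Lowering the boundary speeds responses but
# increases the proportion of trials ending at the opposing boundary.
random.seed(0)
for boundary in (1.5, 0.8):
    trials = [diffusion_trial(drift=0.6, boundary=boundary) for _ in range(1000)]
    rts = [rt for _, rt in trials]
    p_ungram = sum(resp == "ungrammatical" for resp, _ in trials) / len(trials)
    print(f"boundary={boundary}: mean RT = {sum(rts) / len(rts):.2f} s, "
          f"P(ungrammatical) = {p_ungram:.2f}")
```

Only the qualitative pattern matters here: faster mean RTs go hand in hand with more opposing-boundary responses, which is the speed–accuracy trade-off the quoted authors invoke when comparing parallel and serial presentation.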