2021
DOI: 10.1037/rev0000327

Serial memory: Putting chains and position codes in context.

Abstract: From the beginning of research on serial memory, chaining theories and position coding theories have been pitted against each other. The central question is whether items are associated with each other or with a set of position codes that are independent of the items. Around the turn of this century, the debate focused on serial recall tasks and patterns of error data that chaining models could not accommodate. Consequently, theories based on other ideas flourished and position coding models became prominent. …

Cited by 27 publications (28 citation statements)
References 71 publications
“…There is recent work by Logan and Cox (2021) that fits exactly with our suggestion of approximating position codes by other means. In their Theoretical Note, they identified three ways in which position codes could be derived from CRU’s context vectors: summing over the elements of each context vector, computing the similarity between each context vector on the list and the start and end contexts, and updating CRU’s context vectors with “generic” contexts.…”
Section: Is There Evidence for Associations Between Items? (supporting)
confidence: 75%
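A rough sketch of those three read-outs in Python with NumPy (the drift rule, the parameter rho, and the random vectors below are placeholder assumptions standing in for CRU's actual update equations, not the fitted model from Logan & Cox, 2021):

```python
import numpy as np

rng = np.random.default_rng(0)

def unit(dim):
    """A random unit vector (placeholder for item/context inputs)."""
    v = rng.normal(size=dim)
    return v / np.linalg.norm(v)

def drift(c, inp, rho=0.8):
    """One assumed context update: blend the old context with an input
    so the result stays near unit length (rho is a made-up value)."""
    c_new = rho * c + np.sqrt(1 - rho ** 2) * inp
    return c_new / np.linalg.norm(c_new)

dim, n_items = 50, 6
c = unit(dim)
contexts = []
for _ in range(n_items):
    c = drift(c, unit(dim))  # item-driven context evolution
    contexts.append(c)
contexts = np.array(contexts)

# (1) Sum over the elements of each context vector.
sum_code = contexts.sum(axis=1)

# (2) Similarity of each context vector to the start and end contexts.
start, end = contexts[0], contexts[-1]
edge_code = np.column_stack([contexts @ start, contexts @ end])

# (3) Re-run the drift with one fixed "generic" input, so the context
# trajectory depends only on serial position, not on item identity.
generic, g = unit(dim), unit(dim)
generic_code = []
for _ in range(n_items):
    g = drift(g, generic)
    generic_code.append(g)

print(sum_code)   # one scalar per serial position
print(edge_code)  # (start-similarity, end-similarity) per position
```

Under any drift rule of this kind, similarity to the start context falls off with serial position, which is what lets read-out (2) behave like a graded position code even though no position codes were stored.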
“…Instead, our stance is that CRU should either broaden its representations to include within-list position or consider how its existing representations can be used to approximate within-list position. Some of this work has already been undertaken within the CRU architecture: in work published after the submission of this commentary, Logan and Cox (2021) demonstrated various ways in which positional representations could be derived from CRU’s context representations. We return to this issue later in the section Is There Evidence for Associations Between Items?…”
Section: Representations of Serial Order (mentioning)
confidence: 99%
“…However, modeling of LTM often substitutes context cues for position cues. Context cues are the overall contents of the mind that accompany the presentation of an item, including details about the environment and thoughts elicited by processing the item itself (Howard & Kahana, 2002). Recent work by Logan (2021; see also Logan & Cox, 2021) underscores that position cues in models of WM may in fact be an example of this broader class of context cues as defined in conceptual frameworks of LTM (see Howard & Kahana, 2002). This stance follows from previous work on WM in which the role of context cues has been implicated, accounting in particular for the patterns of errors in the immediate serial recall task (Unsworth & Engle, 2006).…”
(mentioning)
confidence: 99%
“…One possibility is to bind each event in the sequence to its temporal order in the sequence. This can be implemented via a contextual drift process, with a context representation that evolves as items in the sequence are retrieved, as in the Temporal Context Model of free recall of lists [2] and, similarly, the Context Retrieval and Updating model [39, 40]. Here we use temporal proximity as the contextual cue by interpreting a sequence as a set of binary associations between temporally proximal events.…”
Section: Storage and Retrieval of Sequences (mentioning)
confidence: 99%
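Read literally, that last idea reduces to a very small sketch in Python; "temporally proximal" is simplified here to adjacent pairs, and everything below is an illustrative assumption rather than the cited models' drifting-context equations:

```python
def store(sequence):
    """Encode a sequence as binary associations between temporally
    proximal events (simplified to adjacent pairs; assumes unique items)."""
    return dict(zip(sequence, sequence[1:]))

def replay(assoc, first):
    """Retrieve the sequence by chaining: each recalled event cues
    the event it was bound to at storage."""
    out = [first]
    while out[-1] in assoc:
        out.append(assoc[out[-1]])
    return out

assoc = store(["A", "B", "C", "D"])
print(replay(assoc, "A"))  # ['A', 'B', 'C', 'D']
```

The dictionary stands in for the association store; in the Temporal Context Model and CRU the retrieval cue is a drifting context vector rather than the previous item itself, which is one reason such models can accommodate error patterns that pure item-to-item chains cannot (as the abstract above notes).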