PsycEXTRA Dataset 2013
DOI: 10.1037/e633262013-853

How Do PDP Models Learn Quasiregularity?

Abstract: Parallel Distributed Processing (PDP) models have had a profound impact on the study of cognition. One domain in which they have been particularly influential is quasiregular learning, in which mastery requires both learning regularities that capture the majority of the structure in the input and learning exceptions that violate those regularities. How PDP models learn quasiregularity is still not well understood. Small- and large-scale analyses of a feedforward, three-layer network were carried out to address t…
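The model class named in the abstract can be sketched in a few lines. The example below is a hypothetical illustration, not the paper's simulations: a three-layer feedforward network trained by backpropagation on a toy quasiregular mapping in which most items follow a rule and a single exception violates it. The task, dimensions, learning rate, and training regime are all invented for illustration.

# Illustrative sketch only (not the paper's simulations): a three-layer
# feedforward network learning a toy quasiregular mapping.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Toy task: the "rule" is to copy the input pattern to the output;
# item 0 is the exception, whose target is the inverted pattern.
n_items, n_features, n_hidden = 20, 8, 16
X = np.array([list(map(int, format(i, "08b"))) for i in range(1, n_items + 1)], dtype=float)
Y = X.copy()
Y[0] = 1.0 - Y[0]

W1 = rng.normal(0.0, 0.5, (n_features, n_hidden))   # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (n_hidden, n_features))   # hidden -> output weights
lr = 1.0

for _ in range(10000):
    H = sigmoid(X @ W1)                  # hidden representations
    O = sigmoid(H @ W2)                  # network output
    dO = (O - Y) * O * (1.0 - O)         # output-layer error signal
    dH = (dO @ W2.T) * H * (1.0 - H)     # backpropagated hidden error
    W2 -= lr * H.T @ dO / n_items
    W1 -= lr * X.T @ dH / n_items

# After training, both the regular items and the exception are typically
# reproduced; the question the paper asks is how the hidden layer carries
# the shared regularity alongside the item-specific exception.
print("mean abs error, regular items:", np.abs(sigmoid(sigmoid(X[1:] @ W1) @ W2) - Y[1:]).mean())
print("mean abs error, exception:    ", np.abs(sigmoid(sigmoid(X[:1] @ W1) @ W2) - Y[:1]).mean())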

Cited by 3 publications (9 citation statements)
References 13 publications
“…The learning mechanism in which the representations of exceptions are warped is responsible for the reduced ability to generalize. Warping is highly local to neighbors of the anchor (Kim et al., 2013), which is why generalization of the newly learned pronunciation is low and regularization itself is high.…”
Section: Warping (mentioning)
confidence: 99%
“…To better understand how internal representations in the Plaut et al. (1996) model made it both generalize appropriately and learn exceptions, Kim, Pitt, and Myung (2013) performed a series of simulations to flesh out how representations of regularities and exceptions were organized in that model. Intuitively, the characteristics of the representations of regular and exception pronunciations would seem to be at odds: learning an exception like pint should hinder the model's ability to generalize its knowledge of how int should be pronounced when encountering new words (e.g., kint, bint, gint).…”
Section: Warping (mentioning)
confidence: 99%
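The generalization test described in that excerpt can be made concrete with a toy simulation. The sketch below is a hypothetical illustration, not the Plaut et al. (1996) model or the Kim, Pitt, and Myung (2013) simulations: a tiny network is trained on items sharing an "-int" rime code, three taking the regular vowel (mint, hint, lint) and one being the exception (pint), and is then probed with untrained onsets (kint, bint, gint) to see which pronunciation the shared structure supports. The encoding, vocabulary, and network size are invented for illustration.

# Hedged toy illustration of the pint / kint-bint-gint generalization probe.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)
onsets = list("mhlpkbg")   # m, h, l, p occur in training; k, b, g are nonword probes

def encode(onset):
    # One shared "-int" rime unit (always on) plus a one-hot onset unit.
    v = np.zeros(1 + len(onsets))
    v[0] = 1.0
    v[1 + onsets.index(onset)] = 1.0
    return v

# Targets: 0.0 = regular vowel (mint, hint, lint), 1.0 = irregular vowel (pint).
train_items = {"m": 0.0, "h": 0.0, "l": 0.0, "p": 1.0}
X = np.array([encode(o) for o in train_items])
Y = np.array([[t] for t in train_items.values()])

W1 = rng.normal(0.0, 0.5, (X.shape[1], 10))   # input -> hidden
W2 = rng.normal(0.0, 0.5, (10, 1))            # hidden -> output
lr = 0.5

for _ in range(20000):
    H = sigmoid(X @ W1)
    O = sigmoid(H @ W2)
    dO = (O - Y) * O * (1.0 - O)
    dH = (dO @ W2.T) * H * (1.0 - H)
    W2 -= lr * H.T @ dO
    W1 -= lr * X.T @ dH

# Untrained onsets contribute only small random weights, so each probe output
# mainly reflects what the shared rime code has learned, i.e. how far the
# exception has pulled ("warped") the shared structure toward its pronunciation.
for o in "kbg":
    p = sigmoid(sigmoid(encode(o) @ W1) @ W2)[0]
    print(f"{o}int -> predicted probability of the irregular (pint-like) vowel: {p:.2f}")

With a vocabulary this small the exact probe values are unstable across random seeds, but the qualitative point of the quoted passages is that the exception can be carried largely by item-specific structure, so its pronunciation need not spread to neighboring nonwords.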