2008
DOI: 10.1080/03640210701802071

A Rational Analysis of Rule‐Based Concept Learning

Abstract: This article proposes a new model of human concept learning that provides a rational analysis of learning feature-based concepts. The model is built upon Bayesian inference over a grammatically structured hypothesis space: a concept language of logical rules. The article compares the model's predictions to human generalization judgments in several well-known category learning experiments and finds good agreement for both average and individual participant generalizations. The article further investigates judgm…
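To make the approach described in the abstract concrete, the following is a minimal, hypothetical sketch, not the authors' implementation: it enumerates short conjunctive rules over binary features, scores each rule with a simplicity prior and a noisy-label likelihood, and generalizes by averaging rule predictions under the resulting posterior. All function names and parameters (noise, prior_decay, max_literals) are illustrative assumptions.

```python
# Hypothetical sketch of Bayesian scoring over a space of logical rules,
# in the spirit of the model described above (not the authors' code).
from itertools import combinations, product

def enumerate_rules(n_features, max_literals=2):
    """Yield rules as tuples of (feature_index, required_value) literals."""
    for size in range(1, max_literals + 1):
        for feats in combinations(range(n_features), size):
            for values in product([0, 1], repeat=size):
                yield tuple(zip(feats, values))

def rule_applies(rule, item):
    return all(item[f] == v for f, v in rule)

def posterior_over_rules(examples, labels, n_features, noise=0.1, prior_decay=0.5):
    """Unnormalized posterior: simplicity prior x noisy-label likelihood, then normalized."""
    scored = {}
    for rule in enumerate_rules(n_features):
        prior = prior_decay ** len(rule)          # shorter rules are favored a priori
        likelihood = 1.0
        for item, label in zip(examples, labels):
            predicted = int(rule_applies(rule, item))
            likelihood *= (1 - noise) if predicted == label else noise
        scored[rule] = prior * likelihood
    z = sum(scored.values())
    return {r: s / z for r, s in scored.items()}

def p_positive(item, posterior):
    """Posterior predictive probability that `item` belongs to the concept."""
    return sum(p for rule, p in posterior.items() if rule_applies(rule, item))

# Usage: three binary features, training labels consistent with "feature 0 is 1".
examples = [(1, 0, 1), (1, 1, 0), (0, 1, 1), (0, 0, 0)]
labels   = [1, 1, 0, 0]
post = posterior_over_rules(examples, labels, n_features=3)
print(round(p_positive((1, 1, 1), post), 2))   # high membership probability
print(round(p_positive((0, 1, 0), post), 2))   # low membership probability
```

The simplicity prior over rule length stands in for the grammar-based prior of the full model; restricting hypotheses to short conjunctions rather than full disjunctive normal form formulas is only for brevity.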



Citations: Cited by 316 publications (394 citation statements)
References: 62 publications
“…The most likely candidates are SUSTAIN, which, given its representation of concepts in terms of separable clusters, shares a broad conceptual similarity with the model theory, and algebraic complexity, which Feldman (2006) describes as the analytic counterpart of SUSTAIN. A recent rule-based approach to concept learning is due to Goodman, Tenenbaum, Feldman, and Griffiths (2008). It uses rational rules and also has some similarity to the model theory.…”
Section: Discussion (mentioning)
confidence: 99%
“…Our framework has room for many kinds of structures and stochastic processes, and these components can be combined to capture many kinds of background knowledge and to provide many different inductive biases. We described four specific applications of our framework, but our framework also has room for models that rely on theories more complex than any of the examples presented here, including theories formulated using grammars (N. D. Goodman, Tenenbaum, Feldman, & Griffiths, 2008) or logical representations (Kemp, Goodman, & Tenenbaum, 2008). …”
Section: Specific Models (mentioning)
confidence: 99%
“…Nosofsky, Palmeri, & McKinley, 1994; Goodman, Tenenbaum, Feldman, & Griffiths, 2008), inferring prototypes (e.g. Posner & Keele, 1968), storing exemplars (e.g.…”
Section: Previous Approaches To Category Learning (mentioning)
confidence: 99%