2019
DOI: 10.1111/cogs.12724
Do Additional Features Help or Hurt Category Learning? The Curse of Dimensionality in Human Learners

Abstract: The curse of dimensionality, which has been widely studied in statistics and machine learning, occurs when additional features cause the size of the feature space to grow so quickly that learning classification rules becomes increasingly difficult. How do people overcome the curse of dimensionality when acquiring real‐world categories that have many different features? Here we investigate the possibility that the structure of categories can help. We show that when categories follow a family resemblance structu…
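The growth the abstract refers to can be illustrated with a toy calculation (ours, not from the paper): with discrete features, the number of distinct possible stimuli grows exponentially in the number of features, so any fixed set of training examples covers an exponentially shrinking fraction of the space. The function and parameter names below are illustrative assumptions.

```python
# Toy illustration (not from the paper): how a discrete stimulus space
# grows with the number of features, and how coverage by a fixed
# training set shrinks.

def space_size(n_features: int, values_per_feature: int = 2) -> int:
    """Number of distinct stimuli with n_features discrete features."""
    return values_per_feature ** n_features

train_set_size = 100  # hypothetical fixed number of training examples
for n in (4, 8, 16, 32):
    size = space_size(n)
    coverage = train_set_size / size
    print(f"{n:2d} binary features: {size:>13,} stimuli, "
          f"max training coverage {coverage:.2e}")
```

With 4 binary features there are only 16 possible stimuli, but at 16 features there are already 65,536, which is why the structure of the category (rather than exhaustive exposure) has to carry the learning.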

Cited by 15 publications (16 citation statements)
References 42 publications
“…Here, the simplest category structures (those with the most concise cognitive representation) are arguably the same as the most informative (those that allow the most accurate reconstruction of the speaker's intended referent): multidimensional convex categories whose members cluster tightly together in similarity space. Supporting this, Vong et al (2019) find that even for stimuli with only four dimensions, categories with family resemblance structure are more learnable than "simpler" categories based on a single dimension. Future work should unpack the extent to which the same structures are simple and informative in different real-world domains.…”
Section: Stimuli and Task
confidence: 74%
“…The specification of the state space significantly impacts the behavior of artificial RL agents. For example, in a large state space, RL performance is limited by what is known as the curse of dimensionality [1,10]: Learning a vast number of state-action values quickly becomes computationally intractable. Defining a smaller state space limited to only task-relevant states is one path toward overcoming this challenge.…”
Section: State Space
confidence: 99%
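The intractability this citing paper describes can be made concrete with a small sketch (our assumptions, not from the cited work): a tabular RL agent whose state is `d` discrete variables with `k` values each must estimate one value per state-action pair, a count that explodes in `d`.

```python
# Sketch (illustrative, not from the cited work): number of state-action
# values a tabular RL learner must estimate when the state is d discrete
# variables with k values each, and the agent has n_actions actions.

def q_table_entries(d: int, k: int, n_actions: int) -> int:
    """Size of a full tabular Q-table over the raw state space."""
    return (k ** d) * n_actions

for d in (2, 4, 8, 16):
    print(f"d={d:2d} state variables: "
          f"{q_table_entries(d, k=10, n_actions=4):,} entries")
```

Restricting the state space to task-relevant variables, as the passage suggests, reduces `d` directly and so shrinks the table multiplicatively.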
“…Despite its tremendous success, there are well known limitations of canonical RL algorithms [10]. Historically, many insights provided by RL research have been demonstrated in relatively simplistic learning tasks, casting doubt on how useful classic RL models are in explaining how humans learn and make choices in everyday life.…”
Section: Introduction
confidence: 99%
“…While it is plausible that decision-makers encode all relevant stimulus information from the low-dimensional stimuli typically considered in the laboratory,¹ in high-dimensional environments, encoding all available sensory information is inefficient, and can impair learning. This reflects a fundamental computational constraint (known as the curse of dimensionality), which affects both machine-learning algorithms (Hastie, Tibshirani, & Friedman, 2009; Li et al., 2017) and human decision-makers (e.g., Bulgarella & Archer, 1962; Edgell et al., 1996; Pishkin, Bourne, & Fishkin, 1974; Vong et al., 2018).…”
Section: Introduction
confidence: 99%
“…In the category-learning literature, stimuli with two to four dimensions are common (e.g., Nosofsky, 1986; Shepard et al., 1961), and stimuli with 16 dimensions are considered to be high-dimensional [e.g., Vong, Hendrickson, Navarro, and Perfors (2018)].…”
confidence: 99%