2022
DOI: 10.1037/xge0001088
Auditory and visual category learning in musicians and nonmusicians.

Abstract: Across three experiments, we compare the ability of amateur musicians and nonmusicians in learning artificial auditory and visual categories that can be described as either rule-based (RB) or information-integration (II) category structures. RB categories are optimally learned using a reflective reasoning process, whereas II categories are optimally learned by integrating information from two stimulus dimensions at a reflexive, pre-decisional processing stage. We found that musicians have selective advantages f…

Cited by 11 publications (10 citation statements); References: 59 publications.
“…Prior work has also demonstrated that children have enhanced learning for auditory over visual information (Budoff & Quinlan, 1964), whereas adults have enhanced learning for visual over auditory information (Roark, Smayda, et al, 2021;Sloutsky & Napolitano, 2003). We found that children do not have modality bias for category learning and performed similarly across modalities.…”
Section: Asymmetric Development Across Categories and Modalities (supporting)
confidence: 61%
“…A power analysis (calculated using the pwr package in the R programming environment; Version 3.6.1.; R Core Team, 2019; see Champely, 2020) indicated that a sample of 21 participants per group would be needed to obtain statistical power at a 0.80 level (α = .05) to detect a small-to-medium difference among conditions ( d = 0.37 or f = 0.185). The effect size was estimated from the smallest between-groups difference from a recent study of auditory categorization (Roark et al, 2022).…”
Section: Methods (mentioning)
confidence: 99%
“…A power analysis (calculated using the pwr package in R; Champely, 2020) indicated that a sample of 21 participants per group would be needed to obtain statistical power at a 0.80 level (alpha = .05) to detect a small-to-medium difference among conditions (d = 0.37 or f = 0.185). The effect size was estimated from the smallest between-group difference from a recent study of auditory categorization (Roark, Smayda, et al, 2021).…”
Section: Methods (mentioning)
confidence: 99%
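A note on the effect-size pair quoted in both power analyses above: for a two-group comparison, Cohen's f is simply half of Cohen's d, which is how d = 0.37 corresponds to f = 0.185. A minimal sketch of that conversion (in Python rather than the cited R pwr package; the specific values are taken from the quoted text):

```python
# Effect-size conversion used in the quoted power analyses.
# For a two-group design, Cohen's f = d / 2 (and d = 2 * f).

def cohens_d_to_f(d: float) -> float:
    """Convert Cohen's d to Cohen's f for a two-group comparison."""
    return d / 2.0

def cohens_f_to_d(f: float) -> float:
    """Convert Cohen's f back to Cohen's d for a two-group comparison."""
    return 2.0 * f

d = 0.37                  # smallest between-group difference, per the quote
f = cohens_d_to_f(d)
print(f)                  # 0.185, matching the f reported alongside d = 0.37
```

The same conversion underlies the interchangeable reporting of "d = 0.37 or f = 0.185" in both citing papers.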
“…The tasks were similar to those used in previous studies that examined RB and II category learning in the auditory domain (Roark, Smayda, et al, 2021). Each participant completed both RB and II tasks, with the order counterbalanced across participants.…”
Section: RB and II Category Learning Tasks (mentioning)
confidence: 99%