2019
DOI: 10.31234/osf.io/4mvyh
Preprint

ManyClasses 1: Assessing the generalizable effect of immediate versus delayed feedback across many college classes

Abstract: Psychology researchers have long attempted to identify educational practices that improve student learning. However, experimental research on these practices is often conducted in laboratory contexts or in a single class, threatening the external validity of the results. In this paper, we establish an experimental paradigm for evaluating the benefits of recommended practices across a variety of authentic educational contexts – a model we call ManyClasses. The core feature is that researchers examine the same r…

Cited by 5 publications (8 citation statements)
References 25 publications
“…In the case of applied research, we think researchers could be guided by the natural variation that occurs in actual practice. This was the strategy for our ManyClasses study (Fyfe et al., 2019). We examined how the timing of feedback on student work affected learning performance in 38 different college classes.…”
Section: Main Text
Mentioning confidence: 99%
“…Multiteam collaborations may sound similar to traditional applications of meta-analysis, a tool to aggregate an existing body of work. Multiteam collaborations often make use of meta-analytic methods (e.g., weighted averaging of effect sizes, aggregating results of multiple studies), but because traditional meta-analyses include studies conducted across time, often by researchers applying different interventions with differing applications or protocols, and strongly affected by publication bias, the ability of meta-analysis to identify the true effect is limited (Fyfe, de Leeuw, Carvalho, Goldstone, & Motz, 2019). Traditional meta-analyses can rarely answer the question of how much of the variability in results or in replicability can be attributed to sample, researcher, statistical power, or bias (e.g., Lakens, Hilgard, & Staaks, 2016).…”
Section: Collaborative Research
Mentioning confidence: 99%
“…By having multiple research sites and collaborators, a single multiteam “study” can reveal what factors influence the effect better than a meta-analysis of independently conducted existing studies. Fyfe et al. (2019) provide a relevant education example where the authors are using a multiteam approach to test the effect of timing of instructor feedback on subsequent class performance. Instead of using a single class with the same curriculum, the authors solicited a wide range of classes and disciplines.…”
Section: Collaborative Research
Mentioning confidence: 99%
“…To that end, we created an OSF webpage that documents the IRB protocol, the analysis plan, and instructor resources (https://osf.io/q84t7/). Further, the manuscript, which outlines the method and data collection details, was submitted as a registered report (Fyfe et al., 2019). Open communication is also a key priority.…”
Section: Large-Scale Collaborative Science: The ManyClasses Approach
Mentioning confidence: 99%