2019
DOI: 10.1177/0165025419877973
The retrospective pretest–posttest design redux: On its validity as an alternative to traditional pretest–posttest measurement

Abstract: We revisit the merits of the retrospective pretest–posttest (RPP) design for repeated-measures research. The underutilized RPP method asks respondents to rate survey items twice during the same posttest measurement occasion from two specific frames of reference: “now” and “then.” Individuals first report their current attitudes or beliefs following a given intervention, and next they are prompted to think back to a specific time prior to the given intervention and rate the item again retrospectively. The desig…

Cited by 117 publications (105 citation statements)
References 39 publications
“…Third, causality cannot be assumed among variables as the study had a cross-sectional design. Even though the retrospective design has been found to be a valuable alternative to traditional pretest-posttest design as it addresses many of its validity concerns and mitigates the response shift bias that seems ubiquitous in the traditional approach, criticisms of the retrospective design include memory-related problems, impression management and response bias (Little et al 2020).…”
Section: Discussion (mentioning)
confidence: 99%
“…The selection of a youth self-report survey took into consideration existing peer-reviewed literature as well as input from state networks and practitioners regarding outcomes that align with current programmatic goals and outcomes that are relevant across different types of STEM programming. With this in mind, the survey chosen assessed five STEM attitudes (i.e., STEM engagement, career interest and knowledge, activity participant, and identity) and four SEL/twenty-first-century skills (i.e., perseverance, critical thinking, relationships with adults, and relationships with peers) that are associated with success in STEM, both academically and professionally, and in this case, the survey was the Common Instrument Suite for Students (CIS-S, Little et al, 2019; Noam, Allen, Shah, & Triggs, 2017; Sneider & Noam, 2019).…”
Section: Research Significance and Aims (mentioning)
confidence: 99%
“…Conceptualization of CIS-S scales STEM engagement. This construct was measured using the Common Instrument, a validated self-report survey of STEM engagement that was developed in partnership with researchers and practitioners in the informal STEM education field (Little et al, 2019;Martinez et al, 2014;Noam et al, 2017;Sneider & Noam, 2019). This scale captures three aspects of engagement: behavioral (e.g., participating or involving oneself in STEM activities or projects), cognitive (e.g., to be drawn to understanding, observing, or figuring out STEM phenomena), and "I like to think of different ways to solve a problem.…”
Section: Measures, Common Instrument Suite-Student (CIS-S) (mentioning)
confidence: 99%