2018
DOI: 10.1177/0163278718759396

Innovations for Evaluation Research: Multiform Protocols, Visual Analog Scaling, and the Retrospective Pretest–Posttest Design

Abstract: In this article, we review three innovative methods that can be used in evaluation research: multiform protocols, visual analog scaling, and the retrospective pretest–posttest design. These three techniques have been proposed for decades, but unfortunately, they are still not readily utilized in evaluation research. Our goal is to familiarize researchers with these underutilized research techniques, which could reduce personnel effort and data collection costs while producing better inferences for a study. W…

Cited by 51 publications (26 citation statements)
References 45 publications
“…Moreover, retrospective designs address concerns regarding response shift bias (i.e. when respondents' personal understanding of a construct changes over time) (Chang and Little 2018;Sibthorp et al 2007), which has been alternately described as a situation in which respondents overstate their knowledge, attitudes or behaviors because they 'don't know what [they] don't know' (Browne 2018, 2). Participants may have limited knowledge or experience to accurately assess their attitudes or behaviors prior to an experience.…”
Section: Discussion
confidence: 99%
“…For two factors ('Environmental attitudes' and 'Self-efficacy') we developed a retrospective pre-post bank of items in which participants were asked to think back to before the program to indicate their level of agreement with items before participating and then to indicate their current level of agreement after their participation. These retrospective pre-post items were developed to enhance variation and sensitivity measuring changes in these attitudes (Chang and Little 2018;Sibthorp et al 2007). All items were anchored using 11-point scales, measured as 0 to 10 with three anchors at the low end, the midpoint and the high end of the scale (see Table 1).…”
Section: Scale Development Process
confidence: 99%
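The excerpt above describes retrospective pre-post items anchored on 11-point (0 to 10) scales, where respondents rate how they were before the program and how they are now. As a minimal sketch in Python, assuming hypothetical item names and made-up ratings rather than data from the cited study, the change score is simply the "now" rating minus the retrospective "then" rating:

# Minimal sketch: scoring retrospective pre-post items on 0-10 anchored scales.
# Item names and values are illustrative, not taken from the cited study.
from statistics import mean

responses = [
    # retrospective "before program" rating and "current" rating, both 0-10
    {"env_attitudes_then": 4, "env_attitudes_now": 7},
    {"env_attitudes_then": 6, "env_attitudes_now": 6},
    {"env_attitudes_then": 3, "env_attitudes_now": 8},
]

# Change score per respondent: current rating minus retrospective rating.
change_scores = [r["env_attitudes_now"] - r["env_attitudes_then"] for r in responses]
print("Mean retrospective pre-post change:", mean(change_scores))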
“…The scales consisted of 10 centimeters lines along with verbal anchors at both ends along which subjects marked their responses. 21 The gender identity scale had verbal anchors of "female" and "male;" for the analyses, the midpoint of the line was assigned a value of 0, representing "gender neutral;" subjects could also choose to identify as "gender fluid" in lieu of or along with completing the VAS.…”
Section: Methods
confidence: 99%
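The excerpt above outlines how a visual analog scale (VAS) response is coded: subjects mark a 10 cm line with verbal anchors at both ends, and the midpoint is recoded to 0. A minimal sketch of that coding step, assuming the mark is recorded as its distance in centimeters from the left anchor (the function name and example values are illustrative, not from the cited study):

# Minimal sketch: converting a VAS mark on a 10 cm anchored line to a
# midpoint-centered score (midpoint = 0, range -5 to +5). Illustrative only.
LINE_LENGTH_CM = 10.0

def score_vas(mark_position_cm: float) -> float:
    """Distance of the mark from the left anchor (cm) -> midpoint-centered score."""
    if not 0.0 <= mark_position_cm <= LINE_LENGTH_CM:
        raise ValueError("mark must fall on the 10 cm line")
    return mark_position_cm - LINE_LENGTH_CM / 2

print(score_vas(5.0))  # 0.0: the midpoint, coded as neutral in the quoted study
print(score_vas(8.2))  # 3.2: toward the right-hand verbal anchor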
“…participates in a discussion" and then asked to share their observation of their child's ability level on a 1 (strongly disagree) to 5 (strongly agree) Likert-style scale, rating their child both before and after camp. Retrospective pretest (RPT) designs have been identified as helpful for reducing concerns associated with incomplete data (Raidl et al, 2004), financial and implementation challenges associated with administering multiple measures (Chang & Little, 2018), and response shift bias (i.e., when respondents' personal understanding of a construct may change over time; Chang & Little, 2018;Sibthorp, Paisley, Gookin, & Ward, 2007), which has been alternately described as a situation in which respondents "don't know what [they] don't know" (Browne, 2018, para. 2).…”
Section: Gagnon and Garst
confidence: 99%
“…One potential limitation of this study is embedded in the design of the PPDO. Although the rationale for and strength of RPT designs have been established (Allen & Nimon, 2007;Raidl et al, 2004;Sibthorp et al, 2007), there are also weaknesses inherent in such designs, including the inaccuracy of recall (Pratt, McGuigan, & Katzev, 2000), the tendency for study participants to respond in socially desirable ways (e.g., learning effects; Marshall et al, 2007), the propensity of participants to inflate program ratings to make the program look good (Pratt et al, 2000), and the fact that RPT feedback is collected only from participants who completed a program, which fails to capture the perspective of those who started but did not finish a program (Chang & Little, 2018). While some suggest a limitation of the RPT design is the desire to demonstrate a learning effect, the relatively low effect sizes (Table 4), evidence of discriminant validity (Table 3), and construct stability across the five dimensions of the PPDO illustrates the effectiveness of RPT as a method in a residential camp context, when using the PPDO.…”
Section: Gagnon and Garst
confidence: 99%
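The passage above weighs the limitations of the retrospective pretest (RPT) design against its observed performance, citing "relatively low effect sizes." As a minimal sketch of one common paired effect size for a retrospective pre-post contrast, Cohen's d computed on the change scores (the ratings below are invented for illustration and do not come from the cited study):

# Minimal sketch: paired-samples effect size (d_z) for retrospective pre vs. post.
# Data are invented for illustration.
from statistics import mean, stdev

retro_pre = [3, 4, 5, 4, 6, 3, 5]
post      = [4, 4, 5, 6, 5, 4, 6]

diffs = [p - r for p, r in zip(post, retro_pre)]
d_z = mean(diffs) / stdev(diffs)  # mean change divided by SD of the change scores
print(f"Paired effect size d_z = {d_z:.2f}")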