2010
DOI: 10.1177/1098214010369170
Insight Into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals

Abstract: To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR) movement, which prioritizes the use of randomized controlled trials (RCTs) to study programs and policies. The purpose of this study was to determine the designs and d…

Cited by 35 publications (35 citation statements)
References 20 publications

“…As with the policy sciences and implementation sciences discussed earlier, the evaluation field contains a wide range of frameworks, theories, approaches, paradigms, and controversies (266-270). Given this diversity, an early and pivotal stage in the process is matching the approaches and methods to the purpose of the evaluation.…”
Section: Evaluation Framework (mentioning)
confidence: 99%
“…Experimental, quasi‐experimental and non‐experimental study designs were coded for each of the 157 articles. In keeping with the definitions used by Christie and Fleischer, experimental designs were quantitative studies where random assignment of units to conditions occurred, whereas quasi‐experimental designs were quantitative studies involving between‐group comparisons without random assignment or within‐group pre‐ to post‐test comparisons (26). Non‐experimental designs were those that were neither experimental nor quasi‐experimental, including quantitative cross‐sectional and qualitative studies.…”
Section: Results (mentioning)
confidence: 99%
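For readers who want to see how the design-coding rules quoted above would operate in practice, here is a minimal sketch in Python. It assumes a simplified study record with illustrative field names (quantitative, random_assignment, between_group, pre_post); these names are not taken from the cited study's actual codebook, and the function only illustrates the Christie and Fleischer style definitions, not the authors' coding instrument.

```python
# Hypothetical sketch of the design-coding rules described in the quoted
# passage. Field names are illustrative, not the cited study's codebook.

from dataclasses import dataclass

@dataclass
class Study:
    quantitative: bool        # uses quantitative outcome data
    random_assignment: bool   # units randomly assigned to conditions
    between_group: bool       # compares distinct groups
    pre_post: bool            # within-group pre- to post-test comparison

def code_design(study: Study) -> str:
    """Return 'experimental', 'quasi-experimental', or 'non-experimental'."""
    if study.quantitative and study.random_assignment:
        return "experimental"
    if study.quantitative and (study.between_group or study.pre_post):
        return "quasi-experimental"
    return "non-experimental"

# Example: a quantitative between-group comparison without randomization
print(code_design(Study(quantitative=True, random_assignment=False,
                        between_group=True, pre_post=False)))
# -> quasi-experimental
```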
“…or evaluation imaginary. We also acknowledge that there is much evaluation practice that does not find its way into journals, which is a limitation researchers of other studies on evaluation practice have acknowledged (e.g., Christie & Fleischer, 2010).…”
Section: The Evaluation Theory Tree (mentioning)
confidence: 97%
“…been explored in North American journals (Christie & Fleischer, 2010), the purpose of the current study was to explore empirically the similarities and differences between evaluation practice in North America and Europe, as evident within their respective professional communities. Our work was guided by the following research questions: Are there differences or similarities in the understandings of evaluation practice in published evaluation articles in Europe and North America?…”
(mentioning)
confidence: 99%