2011
DOI: 10.1353/cpr.2011.0007

From the Ground Up: Building a Participatory Evaluation Model

Abstract: The time and resources invested in this participatory evaluation process enabled successful navigation of two important issues: (1) increased attention to statewide accountability for collaborative public health initiatives, and (2) an increased expectation among health councils and other community partnerships to have a recognized voice in defining the measures used for this accountability.

Cited by 14 publications (13 citation statements)
References 20 publications
“…We have continued to improve the evaluation process over time, and now use a modified RE-AIM (Reach, Effectiveness, Adoption, Implementation, Maintenance) framework (Glasgow, Klesges, Dzewaltowski, Estabrooks, & Vogt, 2006) to guide our evaluation. A logic model (Fielden et al, 2007; Sanchez, Carrillo, & Wallerstein, 2011; Sandoval et al, 2012; Scarinci, Johnson, Hardy, Marron, & Partridge, 2009) in Figure 1 summarizes the overall plan.

Reach: To evaluate reach of the CES-P, we monitor (a) number/types of participants that inquire about the program (via phone, e-mail, and/or information sessions) and their representative organizations; (b) number/types of participants that apply for the program and their representative organizations; and (c) types of participants that are selected, including organizations, areas of health interest, experience in CBPR, and previous history/experience of the CBPR partnerships.

Effectiveness: To evaluate effectiveness of the CES-P, we use standardized evaluation tools for each training session (content, expertise of speakers, usefulness, etc.

…”
Section: Program Evaluation
confidence: 99%
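
The Reach monitoring described in the statement above amounts to tallying inquiries, applications, and selections by participant and organization type. Below is a minimal, hypothetical sketch of how such counts might be recorded and summarized; the field names (Participant, stage, org_type) are illustrative assumptions and are not drawn from the cited paper.

from collections import Counter
from dataclasses import dataclass

@dataclass
class Participant:
    name: str
    org_type: str   # e.g., "community org", "academic", "health dept" (assumed categories)
    stage: str      # "inquired", "applied", or "selected"

def reach_summary(participants):
    """Count participants at each stage, overall and by organization type."""
    by_stage = Counter(p.stage for p in participants)
    by_stage_and_org = Counter((p.stage, p.org_type) for p in participants)
    return by_stage, by_stage_and_org

# Example usage with made-up records:
records = [
    Participant("A", "community org", "inquired"),
    Participant("B", "academic", "applied"),
    Participant("C", "community org", "selected"),
]
totals, breakdown = reach_summary(records)
print(totals)     # Counter({'inquired': 1, 'applied': 1, 'selected': 1})
print(breakdown)  # counts keyed by (stage, org_type)
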
“…Although the utility of CER is perceived as well established in the literature (Campbell & Jovchelovitch, 2007; Israel et al, 1998; Israel, 2005; Minkler & Wallerstein, 2010; Nelson et al, 1998; Wallerstein & Duran, 2006; Wallerstein & Duran, 2010; Zeldin, 2004), measuring and evaluating community engagement in research activities (the extent to which community members are involved with the decisions and activities of the research project) have been limited and have primarily focused on qualitative approaches (Francisco, Paine, & Fawcett, 1993; Goodman et al, 1998; Khodyakov et al, 2013; Lantz, Viruell-Fuentes, Israel, Softley, & Guzman, 2001; McCloskey et al, 2012; Sanchez, Carrillo, & Wallerstein, 2011; Schulz, Israel, & Lantz, 2003). Qualitative methods are effective at assessing community engagement at a project or program level; however, they are time consuming and do not easily scale up for the evaluation of large-scale or multicommunity projects.…”
Section: Introduction
confidence: 99%
“…11 Others have addressed the formation and evolution of CBPR partnerships, including partnership development, capacity building, strategies for integrating multiculturalism into partnership processes, 12,13 and planning for sustainability. 14 …”
Section: Qualitative Data Related To the Journal’s Vision And Content
confidence: 99%