2002
DOI: 10.1016/s1098-2140(02)00200-x

Evaluator Roles and Strategies for Expanding Evaluation Process Influence

Abstract: This article explores various evaluator roles and strategies that have the potential to increase the likelihood that the evaluation process will have an influence on an organization and its members. These roles are (1) educator, (2) consultant, (3) facilitator, and (4) counselor. A brief case study presents the discussion dynamics of an evaluation workgroup. This workgroup resulted in programmatic changes based primarily on the discussion of developing indicators for measuring student outcomes. Practical impli…

Cited by 22 publications (25 citation statements) · References 14 publications

Citation statements, ordered by relevance:
“…In recent years, development evaluation shifted from a detached and auditor approach to a collaborative role. Accepting that a development evaluation "should provide information that is credible and useful, enabling the incorporation of lessons learned into the decision-making process of both recipients and donors" (OECD, 2010, p. 22), its capacity to influence donors and recipients, including development practitioners, becomes central in the evaluation practice and theory (Morabito, 2002; Sonnichsen, 2000; Kirkhart, 2000). In this paper, the term influence is used with the meaning of "the capacity of an evaluation process to affect organizational stakeholders and the entity that is being evaluated" (Morabito, 2002, p. 322), merging the managerial and the political connotation of the evaluation activity that becomes "an input from the complex mosaic from which emerge policy decisions and allocation for planning, design, implementation, and continuance of programmes to better human conditions" (Rossi & Freeman, 1993, p. 15).…”
Section: A Critical Friend
confidence: 99%
“…However, evaluator-consultants, combining their evaluation expertise, management skills, and institutional memory, can still appropriately examine these phenomena by defining the problem and identifying the correct methodology for addressing the issue. (p. 295) Clifford and Sherman (1983), Morabito (2002), and Owen and Lambert (1998) emphasize the organizational development consultant role of the evaluator, whereas Love (1983b) and Brazil (1999) perceive the evaluator as an advisor or a consultant to program managers. The primary objective for internal evaluator's consultancy is to generate evaluative processes and information that have a positive effect on the organization and its initiatives.…”
Section: Consultant
confidence: 99%
“…The evaluator should always seek to increase the utility of internal evaluation information for the purpose of advancing organizational learning (Leviton, 2001). The role of contributing to organizational development is also mentioned by Clifford and Sherman (1983), Morabito (2002), and Owen and Lambert (1998).…”
Section: Beyond Being an Evaluator
confidence: 99%
“…Conceptual and theoretical work on process use (for example, Alkin and Taut, 2003; Fetterman, 2003; Patton, 1997) has begun to spark empirical research (for example, Morabito, 2002; Preskill and Caracelli, 1997; Preskill, Zuckerman, and Matthews, 2003; Taut, 2005; Turnbull, 1998, 1999) that is increasing understanding and use of the concept in informing evaluation practice (Patton, 1998). Our interest in this chapter is twofold: (1) to consider how the construct of process use has been operationalized in empirical research examining process use directly or indirectly and (2) to describe the types of research that have been carried out, with an eye to developing an agenda for ongoing research in this area.…”
Section: Going Through the Process: An Examination of the Operational…
confidence: 99%
“…It is also interesting to note that other studies mention process use (Brett, Hill-Mead, and Wu, 2000; Shulha, 2000) but do not explicitly define the term as it related to their study. Morabito (2002) draws on Patton's definition (1997) but calls the concept "process influence," as a result of Kirkhart's reconceptualization of evaluation use as evaluation influence (2000). Subsequent studies (Russ-Eft, Atwood, and Egherman, 2002; Preskill, Zuckerman, and Matthews, 2003; Kamm, 2004; Taut, 2005) make explicit use of Patton's definition of process use (1997).…”
Section: Operationalization of Process Use
confidence: 99%