2013
DOI: 10.1080/02678373.2013.812358
Organizational interventions: A research-based framework for the evaluation of both process and effects

Abstract: Organizational interventions are often recommended when organizations want to improve employee psychological health and well-being. Research, however, has revealed inconsistent results and reviewers have called for research on why interventions either bring about desired change or fail to do so. Answering the “how” and “why” of intervention outcomes requires a close examination of the elements that hinder or facilitate desired outcomes, thus moving beyond evaluation of only the overall effects. In this paper, …

Cited by 241 publications (301 citation statements) | References 55 publications
“…If this is the case, the variable should be regarded as an intermediate effect and measured as part of the effect evaluation. A systematic way to take intermediate effects into account is to formulate a program theory (10). A program theory states under which conditions researchers expect proximal changes to occur (30) but seems to be missing in many of the included studies.…”
Section: Discussion
confidence: 99%
“…This can be done by studying process variables (9)(10)(11)(12). There are different ways of investigating the implementation process.…”
confidence: 99%
“…For example, managers in a call center who want to implement more enriched work design as a means of gaining a unique competitive advantage in mass markets (Batt, 2002) might experience much resistance from colleagues who believe that it will simply increase costs. In such circumstances, successful implementation of a new work design might require managers to spend more time persuading others as to the value of their ideas and developing a broad base of support (Nielsen & Abildgaard, 2013).…”
Section: Implications for Practice
confidence: 99%
“…As the need to analyze the intervention from the perspective of isomorphism arose long after the interventions had ended, we opted for a retrospective reflection and evaluations of the process. Methodologically, our research is situated in studies of evaluation of organizational interventions (Nielsen & Abildgaard, 2013). As the analysis of our own conduct is by no means unproblematic, we searched for a methodological approach that could provide necessary analytical distance, and be something more conceptual than simple reflection on the interventionists' role typical to action research (Stringer, 2014).…”
Section: Methodology and Research Materials: Analysis of Program Mechanisms
confidence: 99%
“…Realistic evaluation research can focus on organizational actors, intervention design, context, process, or outcomes (Nielsen & Abildgaard, 2013; Soininen, 2013). In this article, we scrutinize the intervention design and its implementation.…”
Section: Research-article (2017)
confidence: 99%