Conventional thinking about preventive interventions focuses over-simplistically on the "package" of activities and/or their educational messages. An alternative is to focus on the dynamic properties of the context into which the intervention is introduced. Schools, communities and worksites can be thought of as complex ecological systems. They can be theorised on three dimensions: (1) their constituent activity settings (e.g., clubs, festivals, assemblies, classrooms); (2) the social networks that connect the people and the settings; and (3) time. An intervention may then be seen as a critical event in the history of a system, leading to the evolution of new structures of interaction and new shared meanings. Interventions impact on evolving networks of person-time-place interaction, changing relationships, displacing existing activities and redistributing and transforming resources. This alternative view has significant implications for how interventions should be evaluated and how they could be made more effective. We explore this idea, drawing on social network analysis and complex systems theory.
Insignificant or modest findings in intervention trials may be attributable to poorly designed or theorised interventions, poorly implemented interventions, or inadequate evaluation methods. The pre-existing context may also account for the effects observed. A combination of qualitative and quantitative methods is outlined that will permit the determination of how context-level factors might modify intervention effectiveness, within a cluster randomised community intervention trial to promote the health of mothers with new babies. The methods include written and oral narratives, key informant interviews, impact logs, and inter-organisational network analyses. Context-level factors that may affect intervention uptake, success and sustainability are the density of inter-organisational ties within communities at the start of the intervention, the centrality of the primary care agencies expected to take a lead with the intervention, the extent of context-level adaptation of the intervention, and the amount of local resources contributed by the participating agencies. Investigation of how intervention effects are modified by context is a new methodological frontier in community intervention trial research.
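As a concrete illustration of two of the network measures named above, the sketch below computes inter-organisational tie density and the degree centrality of a lead agency for one community's baseline network. The agency names, ties and choice of degree centrality are invented for illustration; the abstract does not specify particular software or a particular centrality measure.

```python
# Minimal sketch (hypothetical data): density and lead-agency centrality
# of a single community's inter-organisational network at baseline.
import networkx as nx

# Hypothetical ties between participating agencies in one community.
G = nx.Graph()
G.add_edges_from([
    ("primary_care", "maternal_health_service"),
    ("primary_care", "community_centre"),
    ("maternal_health_service", "local_council"),
    ("community_centre", "local_council"),
])

# Density: proportion of all possible inter-organisational ties that exist.
density = nx.density(G)

# Degree centrality of the lead primary care agency: the share of other
# agencies it is directly connected to.
lead_centrality = nx.degree_centrality(G)["primary_care"]

print(f"network density: {density:.2f}")
print(f"primary care degree centrality: {lead_centrality:.2f}")
```

In a trial such as the one described, measures like these, taken per community at baseline, could then be examined as candidate effect modifiers of intervention uptake and outcomes.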
Research interest in the analysis of stories has increased as researchers in many disciplines endeavor to see the world through the eyes of others. We make the methodological case for narrative inquiry as a unique means to get inside the world of health promotion practice. We demonstrate how this form of inquiry may reveal what practitioners value most in and through their practice, and the indigenous theory or the cause-and-consequence thinking that governs their actions. Our examples draw on a unique data set: two years of diaries kept by community development officers in eight communities engaged in a primary care and community development intervention to reduce postnatal depression and promote the physical health of recent mothers. Narrative inquiry examines the way a story is told by considering the positioning of the actor/storyteller, the endpoints, the supporting cast, the sequencing and the tension created by the revelation of some events in preference to others. Narrative methods may provide special insights into the complexity of community intervention implementation over and above more familiar research methods.
This paper presents issues which arose in the conduct of qualitative evaluation research within a cluster-randomized, community-level, preventive intervention trial. The research involved the collection of narratives of practice regarding the intervention by community development officers working in eight communities over a two-year period. The community development officers were largely responsible for implementing the intervention. We discuss the challenges associated with the collection of data as the intervention unfolded, in particular, the disputes over cues to revise and adjust the intervention (i.e. to use the early data formatively). We explore the ethical uncertainties that arise when multiple parties have different views on the legitimacy of types of knowledge and the appropriate role of research and theory in various trial stages. These issues are discussed drawing on the fields of ethnography, community psychology, epidemiology, qualitative methodology and notions of research reflexivity. We conclude by arguing that, in addition to the usual practice of having an outcome data-monitoring committee, community intervention trials also require a process data-monitoring committee as a forum for debate and decision-making. Without such a forum, the relevance, ethics and position of qualitative evaluation research within randomized controlled trials are destined to be a point of contention rather than a source of insight.