Many interventions that are delivered within public health services have little evidence of effect. Evaluating interventions that are being delivered as part of usual practice offers opportunities to improve the evidence base of public health. However, such evaluation is challenging and requires the integration of research into system-wide practice. The Born in Bradford's Better Start experimental birth cohort offers an opportunity to efficiently evaluate multiple complex community interventions to improve the health, wellbeing and development of children aged 0–3 years. Based on the learning from this programme, this paper offers a pragmatic and practical guide to researchers, public health commissioners and service providers to enable them to integrate research into their everyday practice, thus enabling relevant and robust evaluations within a complex and changing system. Using the principles of co-production, the key challenges of integrating research and practice were identified, and appropriate strategies to overcome them were developed across five key stages: 1) Community and stakeholder engagement; 2) Intervention design; 3) Optimising routinely collected data; 4) Monitoring implementation; and 5) Evaluation. As a result of our learning, we have developed comprehensive toolkits (https://borninbradford.nhs.uk/what-we-do/pregnancy-early-years/toolkit/) including: an operational guide through the service design process; an implementation and monitoring guide; and an evaluation framework. The evaluation framework incorporates implementation evaluations to enable understanding of intervention performance in practice, and quasi-experimental approaches to infer causal effects in a timely manner. We also offer strategies to harness routinely collected data to enhance the efficiency and affordability of evaluations that are directly relevant to policy and practice. These strategies and tools will help researchers, commissioners and service providers to work together to evaluate interventions delivered in real-life settings. More importantly, however, we hope that they will support the development of a connected system that empowers practitioners and commissioners to embed innovation and improvement into their own practice, thus enabling them to learn, evaluate and improve their own services.
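To make the idea of "quasi-experimental approaches using routinely collected data" concrete, the following is a minimal sketch of one such design, a difference-in-differences comparison. It is an illustration only: the column names and example data are hypothetical, and the programme's toolkits describe the evaluation designs actually used.

```python
# Hedged sketch: difference-in-differences on routinely collected outcome data.
# All variable names and values below are hypothetical illustrations.
import pandas as pd
import statsmodels.formula.api as smf

# One row per child: an outcome score, whether their area received the
# intervention ("treated"), and whether measurement occurred after roll-out ("post").
df = pd.DataFrame({
    "outcome": [52, 49, 55, 61, 50, 48, 58, 66],
    "treated": [0, 0, 1, 1, 0, 0, 1, 1],
    "post":    [0, 1, 0, 1, 0, 1, 0, 1],
})

# The interaction term estimates the intervention effect, under the usual
# parallel-trends assumption for the treated and comparison groups.
model = smf.ols("outcome ~ treated * post", data=df).fit()
print(model.params["treated:post"])
```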
Background
Problems with oral language skills in childhood have been linked with poor educational, employment, and mental health outcomes. In the UK, there is increasing concern about the oral language skills of children, particularly children from areas of social disadvantage. Research emphasises the importance of the home language environment as a fundamental bedrock for the development of oral language skills. It is vital, therefore, that support is available to help families in need to provide the optimal language environment for their child. Talking Together is a 6-week home visiting programme recently commissioned by Better Start Bradford to develop parents' knowledge of the importance of a good language environment and to help improve parent-child interactions. This study represents the initial steps in developing a definitive trial of the Talking Together programme.

Method
This study is a two-arm randomised controlled feasibility study in which families who are referred into the Talking Together programme and consent to participate in the trial will be randomly allocated to either an intervention group or a waiting control group. We will assess the recruitment and retention rates, the representativeness of our sample, the appropriateness of our measures, and the sample size needed for a definitive trial. We will also carry out a qualitative evaluation to explore the acceptability of trial procedures for families and service providers, fidelity of delivery, time and resources for training, and barriers and facilitators to engagement with the programme. Clear progression criteria will be used to assess suitability for a definitive trial.

Conclusion
This feasibility study will inform the development of a definitive trial of this home-based visiting programme, which will add to the sparse evidence base on which practitioners can draw when supporting families in need. The lessons learnt from this feasibility study will also inform the wider evaluation work of the Better Start Bradford Innovation Hub.

Trial registration
The trial is registered with the ISRCTN registry: study ID ISRCTN13251954. Date of registration: 21 February 2019 (the trial was retrospectively registered).
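As a concrete illustration of the two-arm allocation described in the Method section above, the following is a minimal sketch of permuted-block 1:1 randomisation. The family IDs, block size and seed are hypothetical; the trial's actual randomisation procedure is specified in its protocol and may differ (for example, in stratification or concealment arrangements).

```python
# Hedged sketch: 1:1 allocation to intervention vs. waiting control using
# permuted blocks. IDs, block size and seed are hypothetical.
import random

def permuted_block_allocation(family_ids, block_size=4, seed=2019):
    """Allocate consenting families 1:1 to the two arms using permuted blocks."""
    rng = random.Random(seed)
    arms = []
    while len(arms) < len(family_ids):
        block = ["intervention", "waiting control"] * (block_size // 2)
        rng.shuffle(block)  # random order within each balanced block
        arms.extend(block)
    return dict(zip(family_ids, arms[:len(family_ids)]))

print(permuted_block_allocation(["F001", "F002", "F003", "F004", "F005"]))
```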
Introduction
Implementation evaluations are integral to understanding whether, how and why interventions work. However, unpicking the mechanisms of complex interventions is often challenging in usual service settings where multiple services are delivered concurrently. Furthermore, many locally developed and/or adapted interventions have not undergone any evaluation, thus limiting the available evidence base. Born in Bradford's Better Start cohort is evaluating the impact of multiple early life interventions being delivered as part of the Big Lottery Fund's 'A Better Start' programme to improve the health and well-being of children living in one of the most socially and ethnically diverse areas of the UK. In this paper, we outline our evaluation framework and protocol for embedding pragmatic implementation evaluation across multiple early years interventions and services.

Methods and analysis
The evaluation framework is based on a modified version of the Conceptual Framework for Implementation Fidelity. Using qualitative and quantitative methods, our evaluation framework incorporates semi-structured interviews, focus groups, routinely collected data and questionnaires. We will explore factors related to the content, delivery and reach of interventions at both individual and wider community levels. Potential moderating factors affecting intervention success, such as participant satisfaction, strategies to facilitate implementation, quality of delivery and context, will also be examined. Interview and focus group guides will be based on the Theoretical Domains Framework to further explore the barriers to and facilitators of implementation. Descriptive statistics will be used to analyse the routinely collected quantitative data, and thematic analysis will be used to analyse the qualitative data.

Ethics and dissemination
The Health Research Authority (HRA) has confirmed that our implementation evaluations do not require review by an NHS Research Ethics Committee (HRA decision 60/88/81). Findings will be shared widely to aid commissioning decisions and will also be disseminated through peer-reviewed journals, summary reports, conferences and community newsletters.
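To show the kind of descriptive analysis of routinely collected delivery data referred to in the Methods and analysis section above, the following is a minimal sketch. The dataset and its fields (sessions_offered, sessions_attended, completed) are hypothetical examples of typical monitoring variables, not the programme's actual data specification.

```python
# Hedged sketch: descriptive summary of hypothetical routine delivery data
# (dose and completion), of the kind used to describe delivery and reach.
import pandas as pd

records = pd.DataFrame({
    "participant_id": ["P1", "P2", "P3", "P4"],
    "sessions_offered": [6, 6, 6, 6],
    "sessions_attended": [6, 4, 2, 5],
    "completed": [True, False, False, True],
})

records["dose"] = records["sessions_attended"] / records["sessions_offered"]
print(records[["sessions_attended", "dose"]].describe())
print("completion rate:", records["completed"].mean())
```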