This study reports a systematic review of randomized controlled trials and quasi‐experimental studies, undertaken to synthesize existing evidence on the effectiveness of continuing professional development (CPD) interventions in dentistry for learning gains, behavior change, and patient outcomes. The authors searched a range of electronic databases from 1986 to the present and screened all potentially relevant studies against pre‐established inclusion/exclusion criteria. Following data extraction and quality appraisal of all included studies, a narrative synthesis was undertaken. Ten studies (reported in fourteen articles) were included; all were evaluations of CPD interventions targeted exclusively at dentists. The ten included studies evaluated a range of interventions: courses/workshops, written information, computer‐assisted learning (CAL), audit/self‐reflection, face‐to‐face support, and "black box" combinations of these interventions. Two studies of high and moderately high quality evaluated CAL‐based CPD for dentists and found its impact equivocal. A black box combination of interventions was rigorously evaluated and showed a moderate impact on patient care, suggesting that multimethod, multiphased CPD has the potential for the greatest impact. There is a need for more high‐quality randomized controlled trials evaluating CPD interventions in dentistry. Future evaluations of CPD interventions should describe the interventions explicitly enough to be replicable and should select appropriate outcomes (patient health and change in practice or behavior, as well as knowledge and understanding) in order to move the evidence base for effective practice forward in this area of dental education.
Strategies to identify and mitigate publication bias and outcome reporting bias are frequently adopted in systematic reviews of clinical interventions, but it is not clear how often they are applied in systematic reviews relating to quantitative health services and delivery research (HSDR). We examined whether these biases are mentioned and/or otherwise assessed in HSDR systematic reviews, and evaluated associated factors to inform future practice. We randomly selected 200 quantitative HSDR systematic reviews published in English from 2007 to 2017 from the Health Systems Evidence database (www.healthsystemsevidence.org). We extracted data on factors that may influence whether or not authors mention and/or assess publication bias or outcome reporting bias. We found that 43% (n = 85) of the reviews mentioned publication bias and 10% (n = 19) formally assessed it. Outcome reporting bias was mentioned and assessed in 17% (n = 34) of all the systematic reviews. An insufficient number of studies, heterogeneity, and a lack of pre-registered protocols were the most commonly reported impediments to assessing the biases. In multivariable logistic regression models, both mentioning and formal assessment of publication bias were associated with: inclusion of a meta-analysis; being a review of intervention rather than association studies; higher journal impact factor; and reporting the use of systematic review guidelines. Assessment of outcome reporting bias was associated with: being an intervention review; authors reporting the use of Grading of Recommendations, Assessment, Development and Evaluation (GRADE); and inclusion of only controlled trials. Publication bias and outcome reporting bias are infrequently assessed in HSDR systematic reviews. This may reflect the inherent heterogeneity of HSDR evidence and the different methodological approaches to synthesising it, lack of awareness of such biases, the limits of current tools, and the lack of pre-registered study protocols. Strategies to help raise awareness of these biases, and methods to assess and mitigate them in HSDR systematic reviews, are therefore needed.