Background: In meta-analysis, the presence of funnel plot asymmetry is attributed to publication bias or other small-study effects, which cause larger effects to be observed in the smaller studies. This issue potentially means inappropriate conclusions are drawn from a meta-analysis. If meta-analysis is to be used to inform decision-making, a reliable way to adjust pooled estimates for potential funnel plot asymmetry is required.

Methods: A comprehensive simulation study is presented to assess the performance of different adjustment methods, including the novel application of several regression-based methods (which are commonly applied to detect publication bias rather than adjust for it) and the popular Trim & Fill algorithm. Meta-analyses with binary outcomes, analysed on the log odds ratio scale, were simulated under scenarios with and without (i) publication bias and (ii) heterogeneity. Publication bias was induced through two underlying mechanisms, assuming the probability of publication depends on (i) the study effect size or (ii) the p-value.

Results: The performance of all methods tended to worsen as unexplained heterogeneity increased and the number of studies in the meta-analysis decreased. Applying the methods conditionally on an initial test for the presence of funnel plot asymmetry generally gave poorer performance than unconditional use of the adjustment method. Several of the regression-based methods consistently outperformed the Trim & Fill estimators.

Conclusion: Regression-based adjustments for publication bias and other small-study effects are easy to conduct and outperformed more established methods over a wide range of simulation scenarios.
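The regression-based adjustments referred to above exploit the same relationship that funnel plot asymmetry tests rely on: under small-study effects, observed effect sizes are associated with their standard errors. As a minimal, hedged sketch (not the specific estimators evaluated in the paper), an Egger-type weighted regression of study log odds ratios on their standard errors can be fitted, with the intercept read off as the effect predicted for a hypothetical study whose standard error tends to zero. All data below are made up for illustration.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical meta-analysis data: study log odds ratios and standard errors.
yi  = np.array([0.41, 0.35, 0.62, 0.55, 0.28, 0.71, 0.48, 0.90])
sei = np.array([0.12, 0.15, 0.25, 0.22, 0.10, 0.30, 0.18, 0.35])

# Egger-type regression: effect size on standard error, weighted by inverse
# variance. The slope measures funnel plot asymmetry; the intercept is the
# effect predicted as the standard error tends to zero, i.e. an estimate
# adjusted for small-study effects.
X = sm.add_constant(sei)
fit = sm.WLS(yi, X, weights=1.0 / sei**2).fit()

print(f"adjusted pooled log OR (intercept): {fit.params[0]:.3f}")
print(f"asymmetry slope: {fit.params[1]:.3f}")
```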
Recently, health systems internationally have begun to use cost-effectiveness research as a formal input into decisions about which interventions and programmes should be funded from collective resources. This process has raised some important methodological questions for this area of research. This paper considers one set of issues related to the synthesis of effectiveness evidence for use in decision-analytic cost-effectiveness (CE) models: the need to synthesise all sources of available evidence, even though these may not 'fit neatly' into a CE model. Commonly encountered problems include the absence of head-to-head trial evidence comparing all options under consideration, the presence of multiple endpoints from trials, and differing follow-up periods. Full evidence synthesis for CE analysis also needs to consider differences in treatment effects between patient subpopulations and the use of nonrandomised evidence. Bayesian statistical methods represent a valuable set of analytical tools for utilising indirect evidence and can make a powerful contribution to the decision-analytic approach to CE analysis. This paper provides a worked example and a general overview of these methods, with particular emphasis on their use in economic evaluation.
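The paper's worked example is not reproduced here, but the simplest situation it alludes to, comparing two treatments that have never been trialled head-to-head, can be illustrated with the Bucher adjusted indirect comparison. In the sketch below all summary statistics are hypothetical: d_AB and d_AC are pooled log odds ratios for treatments B and C against a common comparator A.

```python
import numpy as np
from scipy import stats

# Hypothetical pooled estimates against a common comparator A (log OR scale).
d_AB, se_AB = -0.35, 0.14   # B vs A
d_AC, se_AC = -0.10, 0.16   # C vs A

# Bucher adjusted indirect comparison: the B vs C effect is the difference
# of the two direct effects, and the variances add.
d_BC = d_AB - d_AC
se_BC = np.sqrt(se_AB**2 + se_AC**2)

lo, hi = d_BC - 1.96 * se_BC, d_BC + 1.96 * se_BC
p = 2 * stats.norm.sf(abs(d_BC / se_BC))

print(f"indirect log OR (B vs C): {d_BC:.3f} (95% CI {lo:.3f} to {hi:.3f}), p = {p:.3f}")
```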
Mixed treatment comparison models extend meta-analysis methods to enable comparisons to be made between all relevant comparators in the clinical area of interest. In such modelling it is imperative that potential sources of variability are explored to explain both heterogeneity (variation in treatment effects between trials within pairwise contrasts) and inconsistency (variation in treatment effects between pairwise contrasts), to ensure the validity of the analysis.

The objective of this paper is to extend the mixed treatment comparison framework to allow the incorporation of study-level covariates in an attempt to explain between-study heterogeneity and reduce inconsistency. Three model specifications, each making different assumptions, are described and applied to a 17-treatment network of stroke prevention treatments in individuals with non-rheumatic atrial fibrillation.

The paper demonstrates the feasibility of incorporating covariates within a mixed treatment comparison framework and of using model fit statistics to choose between alternative specifications. Although such an approach may adjust for inconsistencies in networks, as with standard meta-regression the analysis will suffer from low power if the number of trials is small relative to the number of treatment comparators.
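The paper's exact models are not reproduced here, but a hedged sketch of how a study-level covariate x_i can enter a random-effects mixed treatment comparison (with notation assumed for illustration, not taken from the paper) is:

```latex
% Minimal sketch: arm-level binomial likelihood with a study-level covariate.
% Notation (b_i = baseline arm of study i, t_{ik} = treatment in arm k,
% d = basic treatment effects, beta = covariate coefficients) is assumed.
\begin{align*}
r_{ik} &\sim \mathrm{Binomial}(p_{ik},\, n_{ik}) \\
\mathrm{logit}(p_{ik}) &= \mu_i, && k = b_i \text{ (baseline arm)} \\
\mathrm{logit}(p_{ik}) &= \mu_i + \delta_{ik}, && k \neq b_i \\
\delta_{ik} &\sim \mathcal{N}\!\big(d_{t_{ik}} - d_{t_{i b_i}} + \beta_{t_{ik}}\, x_i,\; \tau^2\big)
\end{align*}
```

The three specifications mentioned above differ in the assumptions placed on the covariate coefficients; natural options include a single coefficient common to all comparisons, exchangeable coefficients drawn from a shared distribution, or independent coefficients per comparison.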