Background: Regulators are evaluating the use of non-interventional real-world evidence (RWE) studies to assess the effectiveness of medical products. The RCT-DUPLICATE initiative uses a structured process to design RWE studies emulating randomized controlled trials (RCTs) and to compare the results. Here, we report findings of the first 10 trial emulations, evaluating cardiovascular outcomes of antidiabetic or antiplatelet medications.

Methods: We selected 3 active-controlled and 7 placebo-controlled RCTs for replication. Using patient-level claims data from US commercial and Medicare payers, we implemented inclusion/exclusion criteria, selected primary endpoints, and defined comparator populations to emulate those of each corresponding RCT. Within the trial-mimicking populations, we conducted propensity score matching to control for >120 pre-exposure confounders. All study parameters were prospectively defined and protocols were registered before hazard ratios (HRs) and 95% confidence intervals (CIs) were computed. Success criteria for the primary analysis were pre-specified for each replication.

Results: Despite attempts to emulate the RCT designs as closely as possible, differences between the RCT and corresponding RWE study populations remained. Regulatory conclusions were equivalent in 6 of 10 emulations. The RWE emulations achieved an HR estimate within the 95% CI of the corresponding RCT in 8 of 10 studies. In 9 of 10, either the regulatory-agreement or the estimate-agreement success criterion was fulfilled. The largest differences in effect estimates were found for RCTs in which second-generation sulfonylureas were used as a proxy for placebo with respect to cardiovascular effects. Nine of 10 replications had a standardized difference between effect estimates of <2, suggesting differences within expected random variation.

Conclusions: Agreement between RCT and RWE findings varies depending on which agreement metric is used. Interim findings indicate that selection of active comparator therapies with similar indications and use patterns enhances the validity of RWE. Even in the context of active comparators, concordance between RCT and RWE findings is not guaranteed, partly because trials are not emulated exactly. More trial emulations are needed to understand how often, and in what contexts, RWE findings match RCTs.

Clinical Trial Registration: URL: https://clinicaltrials.gov. Unique identifiers: NCT03936049, NCT04215523, NCT04215536, NCT03936010, NCT03936036, NCT03936062, NCT03936023, NCT03648424, NCT04237935, NCT04237922.
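As a concrete illustration of the agreement criteria described in the abstract above, the following is a minimal Python sketch of the estimate-agreement check (the RWE hazard ratio falling within the RCT's 95% CI) and of a standardized difference between effect estimates computed on the log-HR scale. The function names and the numerical inputs are illustrative assumptions, not values or code from the RCT-DUPLICATE studies.

```python
import math

def log_se_from_ci(lower, upper, z=1.96):
    """Approximate the standard error of log(HR) from a reported 95% CI."""
    return (math.log(upper) - math.log(lower)) / (2 * z)

def agreement_metrics(hr_rct, ci_rct, hr_rwe, ci_rwe):
    """Return the estimate-agreement flag and the standardized difference
    between the RCT and RWE effect estimates on the log-HR scale."""
    se_rct = log_se_from_ci(*ci_rct)
    se_rwe = log_se_from_ci(*ci_rwe)
    # Estimate agreement: the RWE point estimate falls inside the RCT's 95% CI.
    estimate_agreement = ci_rct[0] <= hr_rwe <= ci_rct[1]
    # Standardized difference between the two log hazard ratios.
    std_diff = (math.log(hr_rwe) - math.log(hr_rct)) / math.sqrt(se_rct**2 + se_rwe**2)
    return estimate_agreement, std_diff

# Illustrative values only, not results from any of the trials discussed above.
agree, sdiff = agreement_metrics(hr_rct=0.86, ci_rct=(0.74, 0.99),
                                 hr_rwe=0.91, ci_rwe=(0.83, 1.00))
print(agree, round(sdiff, 2))  # |standardized difference| < 2 is consistent with random variation
```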
The scientific community and decision-makers are increasingly concerned about the transparency and reproducibility of epidemiologic studies using longitudinal healthcare databases. We explored the extent to which published pharmacoepidemiologic studies using commercially available databases could be reproduced by other investigators. We identified a nonsystematic sample of 38 descriptive or comparative safety/effectiveness cohort studies. Seven studies were excluded from reproduction: five because of violations of fundamental design principles and two because of grossly inadequate reporting. In the remaining studies, >1,000 patient characteristics and measures of association were reproduced with a high degree of accuracy (median difference between original and reproduction <2% for patient characteristics and <0.1 for measures of association). An essential component of transparent and reproducible research with healthcare databases is more complete reporting of study implementation. Once reproducibility is achieved, the conversation can be elevated to assess whether suboptimal design choices led to avoidable bias and whether findings are replicable in other data sources.
The US Food and Drug Administration (FDA) is open to accepting real‐world evidence (RWE) to support its assessment of medical products. However, RWE stakeholders lack a shared understanding of FDA's evidentiary expectations for the use of RWE in applications for new drugs and biologics. We conducted a systematic review of publicly available FDA approval documents from January 2019 to June 2021. We sought to quantify, by year, how many approvals incorporated RWE in any form, and the intended use of RWE in those applications. Among approvals with RWE intended to support safety and/or effectiveness, we classified whether and how those studies impacted FDA's benefit‐risk considerations, whether those studies were incorporated into the product label, and the therapeutic area of the medical product. Finally, we summarized FDA's documented feedback where available. We found that 116 approvals incorporated RWE in any form, with the proportion of approvals incorporating RWE increasing each year. Of these approvals, 88 included an RWE study intended to provide evidence of safety or effectiveness. Among these 88 approvals, 65 of the studies influenced FDA's final decision and 38 were included in product labels. The 88 approvals spanned 18 therapeutic areas. FDA's feedback on RWE study quality included methodological issues, sample size concerns, omission of patient-level data, and other limitations. Based on these findings, we anticipate that future guidance on FDA's evidentiary expectations for RWE will incorporate fit‐for‐purpose real‐world data selection and careful attention to study design and analysis.
Importance: Nonrandomized studies using insurance claims databases can be analyzed to produce real-world evidence on the effectiveness of medical products. Given the lack of baseline randomization and measurement issues, concerns exist about whether such studies produce unbiased treatment effect estimates.

Objective: To emulate the design of 30 completed and 2 ongoing randomized clinical trials (RCTs) of medications with database studies using observational analogues of the RCT design parameters (population, intervention, comparator, outcome, time [PICOT]) and to quantify agreement in RCT-database study pairs.

Design, Setting, and Participants: New-user cohort studies with propensity score matching using 3 US claims databases (Optum Clinformatics, MarketScan, and Medicare). Inclusion-exclusion criteria for each database study were prespecified to emulate the corresponding RCT. RCTs were explicitly selected based on feasibility, including power, key confounders, and end points more likely to be emulated with real-world data. All 32 protocols were registered on ClinicalTrials.gov before conducting analyses. Emulations were conducted from 2017 through 2022.

Exposures: Therapies for multiple clinical conditions were included.

Main Outcomes and Measures: Database study emulations focused on the primary outcome of the corresponding RCT. Findings of database studies were compared with RCTs using predefined metrics, including Pearson correlation coefficients and binary metrics based on statistical significance agreement, estimate agreement, and standardized difference.

Results: In these highly selected RCTs, the overall observed agreement between the RCT and the database emulation results was a Pearson correlation of 0.82 (95% CI, 0.64-0.91), with 75% meeting statistical significance agreement, 66% estimate agreement, and 75% standardized difference agreement. In a post hoc analysis limited to 16 RCTs with closer emulation of trial design and measurements, concordance was higher (Pearson r, 0.93; 95% CI, 0.79-0.97; 94% meeting statistical significance agreement, 88% estimate agreement, 88% standardized difference agreement). Weaker concordance occurred among the 16 RCTs for which close emulation of certain design elements that define the research question (PICOT) was not possible with insurance claims data (Pearson r, 0.53; 95% CI, 0.00-0.83; 56% meeting statistical significance agreement, 50% estimate agreement, 69% standardized difference agreement).

Conclusions and Relevance: Real-world evidence studies can reach similar conclusions as RCTs when design and measurements can be closely emulated, but this may be difficult to achieve. Concordance in results varied depending on the agreement metric. Emulation differences, chance, and residual confounding can contribute to divergence in results and are difficult to disentangle.
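The design described above (new-user cohorts with propensity score matching, followed by a hazard ratio estimate for the trial's primary outcome) can be sketched roughly as follows. This is a simplified, hypothetical skeleton on synthetic data, using a plain logistic-regression propensity score, greedy 1:1 caliper matching, and a Cox model from the lifelines package; the actual emulations adjust for far more covariates under registered, prespecified protocols, so none of the specifics below should be read as the authors' implementation.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)

# Synthetic "claims" cohort: treated = new users of the study drug,
# untreated = new users of an active comparator; x1-x3 stand in for the
# many baseline confounders measured in a real emulation.
n = 5000
X = rng.normal(size=(n, 3))
treated = rng.binomial(1, 1 / (1 + np.exp(-(0.4 * X[:, 0] - 0.3 * X[:, 1]))))
hazard = np.exp(0.3 * X[:, 0] + 0.2 * X[:, 2] - 0.2 * treated)
time = rng.exponential(1 / hazard)
event = (time < 2.0).astype(int)          # administrative censoring at 2 years
time = np.minimum(time, 2.0)
df = pd.DataFrame(X, columns=["x1", "x2", "x3"]).assign(treated=treated, time=time, event=event)

# 1) Propensity score: probability of initiating the study drug given baseline covariates.
ps_model = LogisticRegression(max_iter=1000).fit(df[["x1", "x2", "x3"]], df["treated"])
df["ps"] = ps_model.predict_proba(df[["x1", "x2", "x3"]])[:, 1]

# 2) Greedy 1:1 nearest-neighbor matching on the propensity score (caliper 0.01, no replacement).
treated_idx = df.index[df["treated"] == 1]
control_pool = df.loc[df["treated"] == 0, "ps"]
matched = []
for i in treated_idx:
    if control_pool.empty:
        break
    j = (control_pool - df.at[i, "ps"]).abs().idxmin()
    if abs(control_pool.loc[j] - df.at[i, "ps"]) <= 0.01:
        matched += [i, j]
        control_pool = control_pool.drop(j)
matched_df = df.loc[matched]

# 3) Hazard ratio in the matched cohort, analogous to a trial's primary time-to-event analysis.
cph = CoxPHFitter().fit(matched_df[["time", "event", "treated"]],
                        duration_col="time", event_col="event")
print(cph.hazard_ratios_)
```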
Background: Recent years have witnessed a growing body of observational literature on the association between glucose-lowering treatments and cardiovascular disease. However, many of the studies are based on designs or analyses that inadequately address the methodological challenges involved.

Methods: We reviewed recent observational literature on the association between glucose-lowering medications and cardiovascular outcomes and assessed the design and analysis methods used, with a focus on their ability to address specific methodological challenges. We describe and illustrate these methodological issues and their impact on observed associations, providing examples from the reviewed literature. We suggest approaches that may be employed to manage these methodological challenges.

Results: From the evaluation of 81 publications of observational investigations assessing the association between glucose-lowering treatments and cardiovascular outcomes, we identified the following methodological challenges: (1) handling of temporality in administrative databases; (2) handling of risks that vary with time and treatment duration; (3) definition of the exposure risk window; (4) handling of exposures that change over time; and (5) handling of confounding by indication. Most of these methodological challenges may be suitably addressed through application of appropriate methods.

Conclusions/interpretation: Observational research plays an increasingly important role in the evaluation of the clinical effects of diabetes treatment. Implementation of appropriate research methods holds the promise of reducing the potential for spurious findings and the risk that spurious findings will mislead the medical community about the risks and benefits of diabetes medications.
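To make one of the challenges listed above concrete, challenge (3), defining the exposure risk window, is often handled by collapsing pharmacy dispensing records into continuous exposure episodes, bridging refill gaps up to a grace period. The sketch below is a hypothetical illustration of that idea; the column names, the 30-day grace period, and the dates are assumptions for the example, not recommendations from the review.

```python
import pandas as pd

def exposure_episodes(dispensings: pd.DataFrame, grace_days: int = 30) -> pd.DataFrame:
    """Collapse one patient's dispensing records (fill_date, days_supply) into
    continuous exposure episodes, bridging gaps of up to `grace_days`."""
    d = dispensings.sort_values("fill_date").reset_index(drop=True)
    episodes = []
    start = d.at[0, "fill_date"]
    end = start + pd.Timedelta(days=int(d.at[0, "days_supply"]))
    for _, row in d.iloc[1:].iterrows():
        if row["fill_date"] <= end + pd.Timedelta(days=grace_days):
            # A refill within the grace period extends the current episode.
            end = max(end, row["fill_date"] + pd.Timedelta(days=int(row["days_supply"])))
        else:
            episodes.append((start, end))
            start = row["fill_date"]
            end = start + pd.Timedelta(days=int(row["days_supply"]))
    episodes.append((start, end))
    return pd.DataFrame(episodes, columns=["exposure_start", "exposure_end"])

# Illustrative history: the long gap after the second fill ends the first episode.
fills = pd.DataFrame({
    "fill_date": pd.to_datetime(["2020-01-01", "2020-02-05", "2020-06-01"]),
    "days_supply": [30, 30, 30],
})
print(exposure_episodes(fills, grace_days=30))
```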