Effectiveness of a national quality improvement programme to improve survival after emergency abdominal surgery (EPOCH): a stepped-wedge cluster-randomised trial. The Lancet. ISSN 0140-6736. https://doi.org/10.1016/S0140-6736(18)32521-2
Implications of all the available evidence: Despite the success of some smaller projects, there was no survival benefit from a national quality improvement programme to implement a care pathway for patients undergoing emergency abdominal surgery. To succeed, large national quality improvement programmes need to allow for differences between hospitals and ensure teams have both the time and resources needed to improve patient care.
Thrombotic complications are more common in liver disease than might be expected from the coagulopathy described by conventional coagulation tests. Some of these complications may be life-threatening. The phenomenon of hypercoagulation is associated with complications in many populations, but its incidence in liver transplant recipients is unclear. We performed a retrospective database review of intraoperative thromboelastography (TEG) for 124 liver transplant recipients. We assessed the prevalence of hypercoagulation in this group and investigated the relative frequency of both shortened TEG reaction times (R times) and increased net clot strength (G) values. These findings were correlated with thrombotic complications. At baseline, the prevalence of high G values was 15.53% on native TEG, and the prevalence of shortened R times was 6.80% on native-heparinase TEG. Patients with cholestatic pathologies had particularly high rates of hypercoagulation (42.9% with primary biliary cirrhosis and 85.7% with primary sclerosing cholangitis), but hypercoagulation was also common in patients with fulminant hepatic failure (50%) and nonalcoholic steatohepatitis (37.5%). There was a poor correlation between the TEG R time and the international normalized ratio (INR), with 37.7% of TEG analyses demonstrating a short R time with an INR > 2. Six of the patients developed early hepatic artery thrombosis (5%); 3 of these patients had TEG evidence of high G values (P = 0.25), and 4 had short R times (not significant). In conclusion, intraoperative TEG evidence of high G values and short R times is relatively common in liver transplantation. It is unclear what bearing this condition has on thrombotic complications. Conventional coagulation tests cannot diagnose this condition. It is conceivable that such patients may come to harm if hypercoagulability is unrecognized and, therefore, inappropriately managed.
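For readers unfamiliar with the TEG parameters above, net clot strength G is conventionally derived from the maximum amplitude (MA). The sketch below illustrates only that standard relation; it is not taken from the study's analysis, and the MA values shown are invented for illustration.

```python
def net_clot_strength(ma_mm: float) -> float:
    """Compute TEG net clot strength G (dyn/cm^2) from maximum amplitude MA (mm).

    Uses the conventional relation G = 5000 * MA / (100 - MA). Thresholds for
    classifying a 'high G' vary by laboratory and are not taken from this study.
    """
    if not 0 <= ma_mm < 100:
        raise ValueError("MA must be in the range [0, 100) mm")
    return 5000.0 * ma_mm / (100.0 - ma_mm)


if __name__ == "__main__":
    for ma in (55.0, 65.0, 75.0):  # illustrative MA values, not study data
        print(f"MA = {ma:.0f} mm -> G = {net_clot_strength(ma):,.0f} dyn/cm^2")
```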
Summary: The international normalised ratio is frequently raised in patients who have undergone major liver resection, and is assumed to represent a potential bleeding risk. However, these patients have an increased risk of venous thromboembolic events, despite conventional coagulation tests indicating hypocoagulability. This prospective, observational study of patients undergoing major hepatic resection analysed the serial changes in coagulation in the early postoperative period. Thrombin generation parameters and viscoelastic tests of coagulation (thromboelastometry) remained within normal ranges throughout the study period. Levels of the procoagulant factors II, V, VII and X initially fell, but factors V and X returned to or exceeded the normal range by postoperative day five. Levels of factor VIII and von Willebrand factor were significantly elevated from postoperative day one (p < 0.01). Levels of the anticoagulants protein C and antithrombin remained significantly depressed on postoperative day five (p = 0.01). Overall, the imbalance between pro- and anticoagulant factors suggested a prothrombotic environment in the early postoperative period.
IMPORTANCE Intravenous iron is recommended by many clinical guidelines based largely on its effectiveness in reducing anemia. However, the association with important safety outcomes, such as infection, remains uncertain.
OBJECTIVE To examine the risk of infection associated with intravenous iron compared with oral iron or no iron.
DATA SOURCES Medline, Embase, and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched for randomized clinical trials (RCTs) from 1966 to January 31, 2021. Ongoing trials were sought from ClinicalTrials.gov, CENTRAL, and the World Health Organization International Clinical Trials Search Registry Platform.
STUDY SELECTION Pairs of reviewers identified RCTs that compared intravenous iron with oral iron or no iron across all patient populations, excluding healthy volunteers. Nonrandomized studies published since January 1, 2007, were also included. A total of 312 full-text articles were assessed for eligibility.
DATA EXTRACTION AND SYNTHESIS Data extraction and risk of bias assessments were performed according to the Preferred Reporting Items for Systematic Reviews and Meta-analyses (PRISMA) and Cochrane recommendations, and the quality of evidence was assessed using the GRADE (Grades of Recommendation, Assessment, Development, and Evaluation) approach. Two reviewers extracted data independently. A random-effects model was used to synthesize data from RCTs. A narrative synthesis was performed to characterize the reporting of infection.
MAIN OUTCOMES AND MEASURES The primary outcome was risk of infection. Secondary outcomes included mortality, hospital length of stay, and changes in hemoglobin and red blood cell transfusion requirements. Measures of association were reported as risk ratios (RRs) or mean differences.
RESULTS A total of 154 RCTs (32 920 participants) were included in the main analysis. Intravenous iron was associated with an increased risk of infection when compared with oral iron or no iron (RR, 1.17; 95% CI, 1.04-1.31; I² = 37%; moderate certainty of evidence). Intravenous iron also was associated with an increase in hemoglobin (mean difference, 0.57 g/dL; 95% CI, 0.50-0.64 g/dL; I² = 94%) and a reduction in the risk of requiring a red blood cell transfusion (RR, 0.93; 95% CI, 0.76-0.89; I² = 15%) when compared with oral iron or no iron. There was no evidence of an effect on mortality or hospital length of stay.
KEY POINTS Question: In patients who require treatment with intravenous iron, what is the evidence that this intervention increases the risk of developing a new infection? Findings: In this systematic review and meta-analysis of 154 randomized clinical trials that included 32 920 participants, intravenous iron was associated with an increased risk of infection. Meaning: The results of this study suggest that, despite broad advocacy in clinical guidelines, intravenous iron may increase the risk of infection, which must be balanced with the potential benefits of treating anemia and reducing blood transfusion requirements.
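The pooled estimates above come from a random-effects meta-analysis. As an illustration of how a pooled risk ratio, between-trial variance, and I² statistic are typically computed, here is a minimal DerSimonian-Laird sketch in Python. The 2x2 trial counts are invented and are not data from the review, and the review's own statistical software and methods may differ in detail.

```python
import math

# Minimal DerSimonian-Laird random-effects pooling of risk ratios.
# The 2x2 counts below are invented for illustration; they are NOT data
# from the trials included in the review.
trials = [
    # (events_iv_iron, n_iv_iron, events_control, n_control)
    (12, 200, 9, 198),
    (30, 450, 24, 455),
    (7, 120, 5, 118),
]

log_rr, var = [], []
for a, n1, c, n2 in trials:
    rr = (a / n1) / (c / n2)
    log_rr.append(math.log(rr))
    # Large-sample variance of log RR
    var.append(1 / a - 1 / n1 + 1 / c - 1 / n2)

# Fixed-effect (inverse-variance) quantities used by the DL estimator
w = [1 / v for v in var]
fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
df = len(trials) - 1

# Between-trial variance (tau^2) and heterogeneity (I^2)
c_term = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c_term)
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0

# Random-effects pooled risk ratio with 95% CI
w_star = [1 / (v + tau2) for v in var]
pooled = sum(wi * yi for wi, yi in zip(w_star, log_rr)) / sum(w_star)
se = math.sqrt(1 / sum(w_star))
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f}), I^2 = {i2:.0f}%")
```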
The aim of this study was to determine whether the withdrawal of aprotinin (APRO) led to an increased bleeding risk in patients undergoing orthotopic liver transplantation (OLT). A retrospective analysis compared consecutive patients undergoing OLT and treated with aprotinin (APRO group; n = 100) with a group in which aprotinin was not used (no-APRO group; n = 100). Propensity score matching was then performed to identify 2 matched cohorts. Patients were matched by their primary diagnoses and Model for End-Stage Liver Disease scores. This resulted in 2 matched cohorts with 55 patients in each group. None of the patients in the APRO group had significant fibrinolysis. In the no-APRO group, 23.6% of the patients developed fibrinolysis (P < 0.003). Tranexamic acid was used in 61.5% of the patients (n = 8) in the no-APRO group in whom lysis was present, and this resolved the fibrinolysis in all but 1 of these patients. There were no differences in red blood cell, fresh frozen plasma, platelet concentrate, or cryoprecipitate transfusions between the 2 groups. In conclusion, we have shown a significant increase in the prevalence of fibrinolysis during OLT since the withdrawal of APRO. However, there has been no increase in transfusion requirements. Liver Transpl 20:584-590, 2014. © 2014 AASLD.
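The matched-cohort comparison above rests on propensity score matching by primary diagnosis and MELD score. The following is a schematic, hypothetical sketch of 1:1 nearest-neighbour matching on simulated data; the variable names, logistic model, and greedy matching rule are assumptions for illustration and do not reproduce the study's actual procedure.

```python
# Schematic 1:1 nearest-neighbour propensity score matching on MELD score and
# primary diagnosis, loosely mirroring the design described above. The data
# and column names are invented for illustration only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "apro": rng.integers(0, 2, n),                   # 1 = aprotinin era, 0 = no aprotinin
    "meld": rng.integers(6, 40, n),                  # MELD score
    "diagnosis": rng.choice(["HCV", "ALD", "PSC", "FHF"], n),
})

# Estimate the propensity of being in the aprotinin group
X = pd.get_dummies(df[["meld", "diagnosis"]], drop_first=True)
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["apro"]).predict_proba(X)[:, 1]

# Greedy 1:1 nearest-neighbour matching without replacement
treated = df[df["apro"] == 1].sort_values("ps")
controls = df[df["apro"] == 0].copy()
pairs = []
for idx, row in treated.iterrows():
    if controls.empty:
        break
    j = (controls["ps"] - row["ps"]).abs().idxmin()  # closest control by propensity score
    pairs.append((idx, j))
    controls = controls.drop(j)                      # match without replacement

print(f"{len(pairs)} matched pairs formed")
```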