Human immunodeficiency virus (HIV) infection has historically been considered an absolute contraindication to solid-organ transplantation. Because immune function can now be restored with highly active antiretroviral therapy (HAART), we evaluated 24 HIV-positive subjects with end-stage liver disease who underwent orthotopic liver transplantation (OLTX) after HAART became available. Cumulative survival among HIV-positive recipients was similar to that among age- and race-comparable HIV-negative recipients (P=.365, by log-rank test). At 12, 24, and 36 months after OLTX, survival was, respectively, 87.1%, 72.8%, and 72.8% among HIV-positive patients, versus 86.6%, 81.6%, and 77.9% among HIV-negative patients. Survival was poorer among subjects with post-OLTX antiretroviral intolerance (P=.044), a post-OLTX CD4+ cell count of <200 cells/μL (P=.005), a post-OLTX HIV load of >400 copies/mL (P=.016), and hepatitis C virus infection (P=.023). These findings suggest that survival of HIV-positive liver transplant recipients does not differ from that of HIV-negative recipients and that HIV infection should no longer be a contraindication to OLTX. Further prospective studies are warranted.
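The survival percentages reported above are standard Kaplan-Meier (product-limit) estimates. As a minimal illustrative sketch (the function and the toy cohort below are hypothetical, not data from the study), the estimator multiplies, at each observed death time, the fraction of at-risk patients who survive that time, with censored patients leaving the risk set without contributing an event:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate.

    times  : follow-up time for each patient (e.g., months after transplant)
    events : 1 if death was observed at that time, 0 if censored
    Returns a list of (time, survival_probability) steps at each death time.
    """
    n_at_risk = len(times)
    surv = 1.0
    curve = []
    # Walk through the distinct follow-up times in increasing order.
    for t in sorted(set(times)):
        deaths = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if deaths:
            # Multiply by the conditional probability of surviving time t.
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        # Everyone reaching time t (dead or censored) leaves the risk set.
        n_at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Hypothetical toy cohort of 8 patients: 1 = death, 0 = censored.
times = [3, 7, 7, 12, 15, 20, 24, 30]
events = [1, 1, 0, 1, 0, 1, 0, 0]
for t, s in kaplan_meier(times, events):
    print(f"month {t}: survival {s:.3f}")
```

A between-group comparison such as the study's P=.365 would then come from a log-rank test applied to two such curves; dedicated packages (e.g., lifelines in Python or the R survival package) implement both steps.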
The risks identified for zygomycosis and for disseminated disease, including several not previously recognized, have implications for further elucidating the biologic basis of the disease and for optimizing outcomes in solid-organ transplant (SOT) recipients with zygomycosis.
We examined the influence of regulatory dendritic cells (DCreg), generated from cytokine-mobilized donor blood monocytes in vitamin D3 and IL-10, on renal allograft survival in a clinically relevant rhesus macaque model. DCreg expressed low levels of MHC class II and costimulatory molecules, but comparatively high levels of programmed death ligand-1 (B7-H1), and were resistant to pro-inflammatory cytokine-induced maturation. They were infused intravenously (3.5–10 × 10⁶ cells/kg), together with the B7-CD28 costimulation blocking agent CTLA4Ig, 7 days before renal transplantation. CTLA4Ig was given for up to 8 weeks, and rapamycin, started on day −2, was maintained with tapering of blood levels until full withdrawal at 6 months. Median graft survival time was 39.5 days in control monkeys (no DC infusion; n=6) and 113.5 days (p < 0.05) in DCreg-treated animals (n=6). No adverse events were associated with DCreg infusion, and there was no evidence of host sensitization based on circulating donor-specific alloantibody levels. Immunologic monitoring also revealed regulation of donor-reactive memory CD95+ T cells and reduced memory/regulatory T cell ratios in DCreg-treated monkeys compared with controls. At termination, allograft histology showed moderate combined T cell- and Ab-mediated rejection in both groups. These findings justify further preclinical evaluation of DCreg therapy and its therapeutic potential in organ transplantation.
Objective: To review a single center's experience and outcome with living donor transplants.

Summary Background Data: Outcome after living donor transplants is better than after cadaver donor transplants. Since the inception of the authors' program, they have performed 2,540 living donor transplants. For the most recent cohort of recipients, improvements in patient care and immunosuppressive protocols have improved outcome. In this review, the authors analyzed outcome in relation to protocol.

Methods: The authors studied patient and graft survival by decade. For those transplanted in the 1990s, the impact of immunosuppressive protocol, donor source, diabetes, and preemptive transplantation was analyzed. The incidence of rejection, posttransplant steroid-related complications, and return to work was determined. Finally, multivariate analysis was used to study risk factors for worse 1-year graft survival and, for those with graft function at 1 year, risk factors for worse long-term survival.

Results: For each decade since 1960, outcome has improved after living donor transplants. Compared with patients transplanted in the 1960s, those transplanted in the 1990s have better 8-year actuarial patient and graft survival rates. Death with function and chronic rejection have remained major causes of graft loss, whereas acute rejection has become a rare cause of graft loss. Cardiovascular disease has become a more predominant cause of patient death; infection has decreased. Donor source (ideally, an HLA-identical sibling) continues to be important: for living donor transplants, rejection and graft survival rates are related to donor source. Patients who had preemptive transplants or less than 1 year of dialysis had better 5-year graft survival and more frequently returned to full-time employment. Readmission and complications remain problems; of patients transplanted in the 1990s, only 36% never required readmission. Similarly, steroid-related complications remain common. The authors' multivariate analysis shows that the major risk factor for worse 1-year graft survival was delayed graft function. For recipients with 1-year graft survival, risk factors for worse long-term outcome were pretransplant smoking, pretransplant peripheral vascular disease, pretransplant dialysis for more than 1 year, one or more acute rejection episodes, and donor age older than 55 years.

Conclusions: These data show that the outcome of living donor transplants has continued to improve. However, for living donor transplants, donor source affects outcome. The authors also identify other major risk factors affecting both short- and long-term outcome.

The first successful kidney transplants in humans were from identical twin living donors.1,2 Although transplanted before the development of chemical immunosuppression, many of these identical twin grafts had long-term survival. With recognition of the immunosuppressive effects of prednisone and azathioprine, the use of nontwin donors became possible.3,4 Considerable controversy soon follow...
A large prospective, open-label, randomized trial evaluated conversion from calcineurin inhibitor (CNI)- to sirolimus (SRL)-based immunosuppression for preservation of renal function in liver transplantation patients. Eligible patients had received liver allografts 6–144 months previously and maintenance immunosuppression with a CNI (cyclosporine or tacrolimus) since early posttransplantation. In total, 607 patients were randomized (2:1) to abrupt conversion (<24 h) from CNI to SRL (n = 393) or to CNI continuation for up to 6 years (n = 214). Between-group changes in baseline-adjusted mean Cockcroft-Gault GFR at month 12 (the primary efficacy end point) were not significant. The primary safety end point, noninferiority of the cumulative rate of graft loss or death at 12 months, was not met (6.6% vs. 5.6% in the SRL and CNI groups, respectively). Rates of death at 12 months did not differ significantly, and no true graft losses (e.g. liver retransplantation) were observed during the 12-month period. At 52 weeks, SRL conversion was associated with higher rates of biopsy-confirmed acute rejection (p = 0.02) and of discontinuation (p < 0.001), primarily for adverse events. Adverse events were consistent with the known safety profiles of both regimens. In conclusion, liver transplantation patients showed no demonstrable benefit 1 year after conversion from CNI- to SRL-based immunosuppression.