A fundamental step in the successful management of sepsis and septic shock is early empiric antimicrobial therapy. However, for this to be effective, several decisions must be addressed simultaneously: (1) antimicrobial choices should be adequate, covering the most probable pathogens; (2) they should be administered at the appropriate dose, (3) by the correct route, and (4) using the correct mode of administration to achieve adequate concentrations at the infection site. In critically ill patients, antimicrobial dosing is a common challenge and a frequent source of errors, since these patients present deranged pharmacokinetics, namely an increased volume of distribution and altered drug clearance, which may be either increased or decreased. Moreover, the clinical condition of these patients changes markedly over time, either improving or deteriorating. The consequent impact on drug pharmacokinetics further complicates the selection of correct drug schedules and dosing during the course of therapy. In recent years, knowledge of pharmacokinetics and pharmacodynamics, drug dosing, therapeutic drug monitoring, and antimicrobial resistance in critically ill patients has greatly improved, fostering strategies to optimize therapeutic efficacy and to reduce toxicity and adverse events. Nonetheless, delivering adequate and appropriate antimicrobial therapy remains a challenge, since pathogen resistance continues to rise and new therapeutic agents remain scarce. We aim to review the available literature to assess the challenges, impact, and tools for individualizing antimicrobial dosing to maximize exposure and effectiveness in critically ill patients.
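To make the dosing arithmetic concrete, the sketch below shows how an increased volume of distribution and altered clearance shift the two most basic calculations (loading dose and maintenance infusion rate) in a one-compartment model. All parameter values are hypothetical illustrations, not clinical recommendations.

```python
# Minimal one-compartment sketch of how deranged pharmacokinetics in
# critical illness shift basic dosing calculations.
# All parameter values below are hypothetical, for illustration only.

def loading_dose(target_mg_per_l: float, vd_l: float) -> float:
    """Loading dose (mg) = target concentration x volume of distribution."""
    return target_mg_per_l * vd_l

def maintenance_rate(target_mg_per_l: float, cl_l_per_h: float) -> float:
    """Steady-state infusion rate (mg/h) = target concentration x clearance."""
    return target_mg_per_l * cl_l_per_h

target = 20.0  # hypothetical target plasma concentration, mg/L

# A "typical" patient versus a septic patient with capillary leak
# (higher Vd) and augmented renal clearance (higher CL).
for label, vd, cl in [("typical patient", 30.0, 5.0),
                      ("septic patient ", 50.0, 9.0)]:
    print(f"{label}: loading dose = {loading_dose(target, vd):.0f} mg, "
          f"maintenance = {maintenance_rate(target, cl):.0f} mg/h")
```

With these assumed values, the septic patient needs a markedly larger loading dose (driven by the volume of distribution) and a higher maintenance rate (driven by clearance), which is why a single fixed schedule can underdose such patients.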
The SARS-CoV-2 pandemic has placed great strain on even the most developed health care systems, especially in the context of critical care. Although co-infections with cytomegalovirus (CMV) are frequent in the critically ill owing to underlying immune suppression of multiple causes, their impact on COVID-19 patients remains unclear. Furthermore, severe COVID-19 has recently been associated with significant immune suppression, which may in turn favor CMV reactivation and possibly contribute to the clinical course. Nevertheless, multiple confounding factors in these patients will certainly challenge upcoming research. The authors present a case series of five patients admitted to the intensive care unit (ICU) with respiratory failure due to severe COVID-19. All patients developed CMV reactivation during their ICU stay.
The high prevalence of infectious diseases in the intensive care unit (ICU), and the consequent pressure for immediate and effective treatment, have led to increased antimicrobial consumption and misuse. Moreover, the emerging global threat of antimicrobial resistance and the lack of novel antimicrobials justify the implementation of judicious antimicrobial stewardship programs (ASP) in the ICU. However, even though the importance of ASP is generally accepted, its implementation in the ICU is far from optimal, and current evidence regarding strategies such as de-escalation remains controversial. The limitations of clinical guidance for initiating and discontinuing antimicrobial therapy have prompted multiple studies evaluating more objective tools, such as biomarkers, as adjuncts for ASP. C-reactive protein and procalcitonin can be adequate for clinical use in acute infectious diseases, the latter being the most studied for ASP purposes. Although promising, current evidence highlights challenges in biomarker application and interpretation. Furthermore, the physiological alterations of the critically ill make pharmacokinetic and pharmacodynamic parameters crucial for adequate antimicrobial use. Individualized pharmacokinetic and pharmacodynamic targets can reduce antimicrobial misuse and the risk of antimicrobial resistance.
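As one concrete example of an individualized pharmacokinetic/pharmacodynamic target, the sketch below estimates %fT>MIC (the fraction of the dosing interval during which the free drug concentration exceeds the pathogen's MIC) for a one-compartment IV bolus model. The 50% target and all parameter values are illustrative assumptions, not recommendations.

```python
import math

def ft_above_mic(dose_mg: float, vd_l: float, cl_l_per_h: float,
                 tau_h: float, mic_mg_per_l: float, fu: float = 1.0) -> float:
    """Fraction of the dosing interval with free concentration above the MIC,
    for a single IV bolus in a one-compartment model (accumulation ignored)."""
    ke = cl_l_per_h / vd_l        # elimination rate constant (1/h)
    c0 = fu * dose_mg / vd_l      # free peak concentration (mg/L)
    if c0 <= mic_mg_per_l:
        return 0.0
    # Time for C(t) = c0 * exp(-ke * t) to decay down to the MIC.
    t_above = math.log(c0 / mic_mg_per_l) / ke
    return min(t_above / tau_h, 1.0)

# Hypothetical beta-lactam-like scenario with a 50% fT>MIC target:
for label, cl in [("normal clearance   ", 6.0), ("augmented clearance", 15.0)]:
    f = ft_above_mic(dose_mg=1000, vd_l=25.0, cl_l_per_h=cl,
                     tau_h=8.0, mic_mg_per_l=4.0)
    print(f"{label}: fT>MIC = {f:.0%} -> target {'met' if f >= 0.5 else 'NOT met'}")
```

Under these assumed values, augmented renal clearance alone drops the same regimen below the 50% target, illustrating why fixed schedules can silently fail in the critically ill.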
Anemia and iron deficiency were highly prevalent in internal medicine patients. As anemia negatively impacts in-hospital mortality, awareness should be raised for the effective diagnosis and management of these comorbidities in hospitalized patients.
Introduction: First antiretroviral therapy (ART) is often switched to simpler, more potent, or better tolerated regimens [1, 2]. Although discontinuation rates are frequently studied, the durability of regimens is rarely addressed.

Materials and Methods: Retrospective study with the following objectives: to analyze first ART schemes and their durability in naive patients with chronic HIV-1 and HIV-2 infections, and to evaluate factors influencing ART change, second-line ART, and the consequent virologic and immunologic responses. Patients were followed at a Central University Hospital, started ART between January 2007 and December 2012, and changed their first regimens. Clinical data were obtained from medical records and analyzed using the Statistical Package for the Social Sciences (version 20).

Results: Of the 652 naive patients who started ART, 164 changed regimens. The majority had HIV-1 infection (n=158). The mean age was 43.9 years (standard deviation ±14.3), with a male predominance of 57.9%. Regimens with efavirenz were the most common among HIV-1 patients (50%), followed by lopinavir/r (22%). In HIV-2 patients, lopinavir/r regimens (n=3) were the most prevalent. First ART regimens had a mean duration of 12.1 months. There was no difference in durability between NNRTI (59.8%) and protease inhibitor (40.2%) schemes. Adverse reactions were the major cause of ART switching (55.5%), followed by therapy resistance (12.1%). Age was inversely related to durability (p=0.007, Mann-Whitney; Phi coefficient −0.161) and associated with the appearance of adverse reactions (p=0.04, Chi-square). Younger patients had a 27% lower risk of adverse reactions, and adverse reactions increased the risk of inferior durability by 40%. Psychiatric symptoms (28.4%) were the most prevalent adverse reactions, all attributed to efavirenz. The year of ART initiation was associated with different durability rates (p=0.005, Mann-Whitney); patients who started ART before 2010 had a 25.8% lower probability of inferior ART duration. After second-line ART regimens, CD4+ T-cell counts >500 cells/µL increased by 38%, and a favourable virologic outcome was achieved in 84%.

Conclusions: Adverse reactions were the main cause of ART switching, supporting a cautious approach when initiating regimens, particularly in older patients. All ART-naive patients who changed initial therapy had favourable immunologic and virologic responses.
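A minimal sketch of the kinds of statistical comparisons reported above (Mann-Whitney U for durability across groups, chi-square for the association with adverse reactions), using scipy on hypothetical data; the values below are illustrative and are not the study's dataset.

```python
import numpy as np
from scipy.stats import mannwhitneyu, chi2_contingency

rng = np.random.default_rng(0)

# Hypothetical regimen durability (months) for younger vs. older patients.
durability_younger = rng.exponential(scale=14.0, size=80)
durability_older = rng.exponential(scale=10.0, size=80)
u_stat, p_mw = mannwhitneyu(durability_younger, durability_older,
                            alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_mw:.3f}")

# Hypothetical 2x2 table: age group vs. occurrence of adverse reactions.
table = np.array([[30, 50],    # younger: with / without adverse reactions
                  [45, 35]])   # older:   with / without adverse reactions
chi2, p_chi, dof, _ = chi2_contingency(table)
print(f"Chi-square = {chi2:.2f} (dof = {dof}), p = {p_chi:.3f}")
```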