SUMMARY

Background. Bloodstream infection (BSI) is a major cause of morbidity and mortality throughout the world. Rapid identification of bloodstream pathogens is a laboratory practice that supports strategies for a rapid transition to directed targeted therapy, enabling timely and effective patient care. In fact, the more rapidly appropriate antimicrobials are prescribed, the lower the mortality for patients with sepsis. Rapid identification methods may have multiple positive impacts on patient outcomes, including reductions in mortality, morbidity, hospital lengths of stay, and antibiotic use. In addition, the strategy can reduce the cost of care for patients with BSIs.

Objectives. The purpose of this review is to evaluate the evidence for the effectiveness of three rapid diagnostic practices in decreasing the time to targeted therapy for hospitalized patients with BSIs. The review was performed by applying the Centers for Disease Control and Prevention's (CDC's) Laboratory Medicine Best Practices Initiative (LMBP) systematic review methods for quality improvement (QI) practices and translating the results into evidence-based guidance (R. H. Christenson et al., Clin Chem 57:816–825, 2011, http://dx.doi.org/10.1373/clinchem.2010.157131).

Search strategy. A comprehensive literature search was conducted to identify studies with measurable outcomes. A search of three electronic bibliographic databases (PubMed, Embase, and CINAHL), databases containing "gray" literature (unpublished academic, government, or industry evidence not governed by commercial publishing) (CIHI, NIHR, SIGN, and other databases), and the Cochrane database for English-language articles published between 1990 and 2011 was conducted in July 2011.

Dates of search. The dates of our search were from 1990 to July 2011.

Selection criteria. Animal studies and non-English publications were excluded.
The search contained the following medical subject headings: bacteremia; bloodstream infection; time factors; health care costs; length of stay; morbidity; mortality; antimicrobial therapy; rapid molecular techniques, polymerase chain reaction (PCR); in situ hybridization, fluorescence; treatment outcome; drug therapy; patient care team; pharmacy service, hospital; hospital information systems; Gram stain; pharmacy service; and spectrometry, mass, matrix-assisted laser desorption-ionization. Phenotypic terms, as well as the following key words, were searched: targeted therapy; rapid identification; rapid; Gram positive; Gram negative; reduce(d); cost(s); pneumoslide; PBP2; tube coagulase; matrix-assisted laser desorption/ionization time of flight; MALDI TOF; blood culture; EMR; electronic reporting; call to provider; collaboration; pharmacy; laboratory; bacteria; yeast; ICU; and others. In addition to the electronic search, a request for unpublished quality improvement data was made to the clinical laboratory community.

Main results. Rapid molecular testing with direct communication significantly improves timeliness compared to standard testing. Rapid ...
SUMMARY

Background. Urinary tract infection (UTI) is the most common bacterial infection in the United States, and urine cultures often make up the largest portion of the workload for a hospital-based microbiology laboratory. Appropriately managing the factors affecting the preanalytic phase of urine culture contributes significantly to the generation of meaningful culture results that ultimately affect patient diagnosis and management. Urine culture contamination can be reduced with proper techniques for urine collection, preservation, storage, and transport, the major factors affecting the preanalytic phase of urine culture.

Objectives. The purposes of this review were to identify and evaluate preanalytic practices associated with urine specimens and to assess their impact on the accuracy of urine culture microbiology. Specific practices included collection methods for men, women, and children; preservation of urine samples in boric acid solutions; and the effect of refrigeration on stored urine. Practice efficacy and effectiveness were measured by two parameters: reduction of urine culture contamination and increased accuracy of patient diagnosis. The CDC Laboratory Medicine Best Practices (LMBP) initiative's systematic review method for assessment of quality improvement (QI) practices was employed.
Results were then translated into evidence-based practice guidelines.

Search strategy. A search of three electronic bibliographic databases (PubMed, SCOPUS, and CINAHL), as well as hand searching of bibliographies from relevant information sources, for English-language articles published between 1965 and 2014 was conducted.

Selection criteria. The search contained the following medical subject headings and key text words: urinary tract infections; UTI; urine/analysis; urine/microbiology; urinalysis; specimen handling; preservation, biological; preservation; boric acid; boric acid/borate; refrigeration; storage; time factors; transportation; transport time; time delay; time factor; timing; urine specimen collection; catheters, indwelling; urinary reservoirs, continent; urinary catheterization; intermittent urethral catheterization; clean voided; midstream; Foley; suprapubic; bacteriological techniques; and microbiological techniques.

Main results. Both boric acid and refrigeration adequately preserved urine specimens prior to their processing for up to 24 h. Urine held at room temperature for more than 4 h showed overgrowth of both clinically significant and contaminating microorganisms. The overall strength of this body of evidence, however, was rated as low. For urine specimens collected from women, there was no difference in rates of contamination for midstream urine specimens collected with or without cleansing. The overall strength of this evidence was rated as high. The levels of diagnostic accuracy of midstream urine collection with or without cleansing were similar, although the overall strength of this evidence was rated as low. For urine specimens collected from men, there was a reduction in contamination in favor of mi...
Blood culture contamination greatly affects clinical decisions. Hence, it is of interest to assess the influence of factors such as the volume of blood drawn and the site of blood draw on the rates of blood culture contamination. In a retrospective study, blood cultures from infants and children up to 18 years of age who had at least one positive blood culture during the year 2006 were analyzed for their volume of blood drawn, patient's weight, site of blood draw used, and blood culture results. Blood cultures were deemed adequate collections if they contained an appropriate weight-related volume of blood. Moreover, blood culture results were categorized as true pathogens, contaminants, and negative cultures; these were then compared and analyzed with respect to their volume and site of blood draw. A total of 5,023 blood cultures were collected during 2006, of which 843 were analyzed. There were 306 (36%) positive cultures among the 843 cultures analyzed. Of the 306 positive cultures, 98 (32%) were contaminants and 208 (68%) cultures grew significant pathogens. Thirty-five percent of the contaminant cultures had adequate volume compared to 60% in the true bacteremia group (P < 0.001). Also, of the 843 cultures, the rates of contamination among the different sites of blood draw were as follows: peripheral venipuncture, 36%; arterial, 10%; and central venous access, 7% (P = 0.155). The rate of contamination was higher with lower blood volumes, and there was no significant difference in the rates of contamination among the different sites of blood draw.

Blood cultures are vital for identifying pathogens causing serious infections and in directing appropriate antibiotic therapy. Moreover, they remain the standard method for detecting bacteremia in the evaluation of sick patients (14). Unfortunately, blood culture contamination is a common occurrence and may lead to confusion regarding the significance of a positive blood culture.
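The proportions reported above can be reproduced with a short calculation. The counts (843 analyzed, 306 positive, 98 contaminants, 208 pathogens) are taken from the abstract; the helper function name is our own, shown only as a minimal sketch:

```python
# Sketch: recompute the blood culture proportions reported in the abstract.
# Counts come from the study; percent() is an illustrative helper.

def percent(part, whole):
    """Return part as a whole-number percentage of whole."""
    return round(100 * part / whole)

total_analyzed = 843
positive = 306
contaminants = 98
pathogens = 208

print(percent(positive, total_analyzed))  # positives among analyzed cultures -> 36
print(percent(contaminants, positive))    # contaminants among positives      -> 32
print(percent(pathogens, positive))       # true pathogens among positives    -> 68
```

The contaminant and pathogen shares are computed against the 306 positive cultures, not the full 843, which is why they sum to 100%.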
The most common contaminants are coagulase-negative Staphylococcus species, which are also becoming more prevalent as primary pathogens in immunocompromised patients and in patients with indwelling intravascular devices (9, 15). The uncertain clinical significance of potential contaminants leads to longer hospital stays, unnecessary antibiotic therapy, and additional laboratory testing; as a result, the cost incurred by a hospital is many times that incurred by the laboratory (2).

Many factors influence the yield of blood cultures, but the single most important factor is blood volume. Several studies have shown that the rate of isolation of pathogens from blood cultures increases with the quantity of blood submitted (12). Hence, a blood culture may be falsely negative because of an inadequate blood volume (6). Furthermore, the blood culture contamination rate correlates inversely with the volume of blood (3). The site and method of blood collection are also known to influence the rate of contamination of blood cultures (8). Vascular-access devices, such as arterial and central venou...
The VITEK 2 and Phoenix extended-spectrum β-lactamase (ESBL) detection systems, which comprise confirmatory tests and expert systems, were evaluated for their ability to discriminate between 102 well-characterized strains of ESBL-positive or -negative Escherichia coli, Klebsiella pneumoniae, and Klebsiella oxytoca. At least 38 distinct ESBLs were included. The strains were chosen to include some known to cause false-positive and false-negative CLSI ESBL confirmatory test results. Therefore, enzyme characterizations, rather than CLSI tests, were the reference methods for the Phoenix and VITEK 2 evaluations. A third arm of the study was conducted with the Phoenix test using two normally inactive expert rules intended to enhance ESBL detection, in addition to using the currently available software. The Phoenix ESBL confirmatory test and unmodified expert system exhibited 96% sensitivity and 81% specificity for ESBL detection. Activation of the two additional rules increased sensitivity to 99% but reduced the specificity to 58%. The VITEK 2 ESBL confirmatory test exhibited 91% sensitivity, which was reduced to 89% sensitivity by its expert system, while its specificity was 85%. Many of the expert system interpretations of both instruments were helpful, but some were suboptimal. The VITEK 2 expert system was potentially more frustrating because it provided more inconclusive interpretations of the results. Considering the high degree of diagnostic difficulty posed by the strains, both ESBL confirmatory tests were highly sensitive. The expert systems of both instruments require modification to update and enhance their utility.
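Sensitivity and specificity as used above are defined from confusion-matrix counts. The sketch below illustrates the definitions only; the strain counts are hypothetical (the abstract reports percentages, not the raw split of the 102-strain panel):

```python
# Sketch: sensitivity and specificity from confusion-matrix counts.
# The counts below are hypothetical, chosen only to illustrate the
# definitions behind figures like "96% sensitivity and 81% specificity".

def sensitivity(tp, fn):
    """True-positive rate: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical split: 70 ESBL-positive strains (67 detected, 3 missed)
# and 32 ESBL-negative strains (26 correctly negative, 6 false positives).
sens = sensitivity(tp=67, fn=3)
spec = specificity(tn=26, fp=6)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```

Note that sensitivity depends only on the truly ESBL-positive strains and specificity only on the truly negative ones, which is why activating extra expert rules can raise one while lowering the other.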