Background and objective: The clinical note documents the clinician's information collection, problem assessment, and clinical management, and it is used for administrative purposes. Electronic health records (EHRs) are being implemented in clinical practices throughout the USA, yet it is not known whether they improve the quality of clinical notes. The goal of this study was to determine whether EHRs improve the quality of outpatient clinical notes. Materials and methods: A five-and-a-half-year longitudinal, retrospective, multicenter, quantitative study comparing the quality of handwritten and electronic outpatient clinical visit notes for 100 patients with type 2 diabetes at three time points: 6 months before the introduction of the EHR (before-EHR), 6 months after the introduction of the EHR (after-EHR), and 5 years after the introduction of the EHR (5-year-EHR). QNOTE, a validated quantitative instrument, was used to assess the quality of outpatient clinical notes; its scores range from a low of 0 to a high of 100. Sixteen primary care physicians with active practices used QNOTE to rate the quality of the 300 patient notes. Results: The before-EHR, after-EHR, and 5-year-EHR grand mean scores (SD) were 52.0 (18.4), 61.2 (16.3), and 80.4 (8.9), respectively, and the changes in scores from before-EHR to after-EHR and from before-EHR to 5-year-EHR were 18% (p<0.0001) and 55% (p<0.0001), respectively. All element and grand mean quality scores improved significantly over the 5-year interval. Conclusions: The EHR significantly improved the overall quality of the outpatient clinical note and the quality of all of its elements, including the core and non-core elements. To our knowledge, this is the first study to demonstrate that the EHR significantly improves the quality of clinical notes.
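For clarity, the 18% and 55% figures correspond to relative changes from the before-EHR grand mean. A minimal arithmetic sketch (Python) using the means quoted above:

```python
# Grand mean QNOTE scores reported in the abstract (0-100 scale).
before_ehr = 52.0
after_ehr = 61.2
five_year_ehr = 80.4

def relative_change(baseline, follow_up):
    """Percent change from the baseline mean."""
    return 100.0 * (follow_up - baseline) / baseline

print(f"before-EHR to after-EHR:  {relative_change(before_ehr, after_ehr):.0f}%")    # ~18%
print(f"before-EHR to 5-year-EHR: {relative_change(before_ehr, five_year_ehr):.0f}%")  # ~55%
```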
Background and objective: The outpatient clinical note documents the clinician's information collection, problem assessment, and patient management, yet there is currently no validated instrument to measure the quality of the electronic clinical note. This study evaluated the validity of the QNOTE instrument, which assesses 12 elements in the clinical note, for measuring the quality of clinical notes. It also compared its performance with a global instrument that assesses the clinical note as a whole. Materials and methods: Retrospective multicenter blinded study of the clinical notes of 100 outpatients with type 2 diabetes mellitus who had been seen in clinic on at least three occasions. The 300 notes were rated by eight general internal medicine and eight family medicine practicing physicians. The QNOTE instrument scored the quality of the note as the sum of a set of 12 note element scores, and its inter-rater agreement was measured by the intraclass correlation coefficient. The Global instrument scored the note in its entirety, and its inter-rater agreement was measured by the Fleiss κ. Results: The overall QNOTE inter-rater agreement was 0.82 (CI 0.80 to 0.84), and its note quality score was 65 (CI 64 to 66). The Global inter-rater agreement was 0.24 (CI 0.19 to 0.29), and its note quality score was 52 (CI 49 to 55). The QNOTE quality scores were consistent, and the overall QNOTE score was significantly higher than the overall Global score (p=0.04). Conclusions: We found the QNOTE to be a valid instrument for evaluating the quality of electronic clinical notes, and its performance was superior to that of the Global instrument.
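The abstract contrasts an element-based score (QNOTE) with a single whole-note rating, and reports agreement with the intraclass correlation coefficient. As a rough illustration only, here is a minimal Python sketch of an equally weighted 12-element score and an ICC(2,1) computation; the element names, the 0-100 per-element scale, and the equal weighting are assumptions for illustration, not the published QNOTE rubric.

```python
import numpy as np

# Hypothetical QNOTE-style elements: the actual QNOTE element names, scales,
# and weights are not given in the abstract, so everything below is assumed.
ELEMENTS = [
    "chief_complaint", "history_present_illness", "problem_list",
    "past_medical_history", "medications", "allergies",
    "review_of_systems", "physical_exam", "assessment",
    "plan_of_care", "follow_up", "labs_and_studies",
]

def qnote_style_score(element_scores):
    """Overall note quality as the mean of 12 element scores (0-100 scale)."""
    assert len(element_scores) == len(ELEMENTS)
    return float(np.mean(element_scores))

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.
    `ratings` is an (n_notes x n_raters) array of continuous scores."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    note_means = ratings.mean(axis=1, keepdims=True)
    rater_means = ratings.mean(axis=0, keepdims=True)
    grand_mean = ratings.mean()
    ss_rows = k * ((note_means - grand_mean) ** 2).sum()
    ss_cols = n * ((rater_means - grand_mean) ** 2).sum()
    ss_error = ((ratings - grand_mean) ** 2).sum() - ss_rows - ss_cols
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))
    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Toy data: 5 notes rated by 3 raters on the 0-100 scale.
rng = np.random.default_rng(0)
true_quality = rng.uniform(40, 90, size=(5, 1))
scores = np.clip(true_quality + rng.normal(0, 5, size=(5, 3)), 0, 100)
print("QNOTE-style total for one note:",
      round(qnote_style_score(rng.integers(50, 100, size=12)), 1))
print("ICC(2,1) inter-rater agreement:", round(icc_2_1(scores), 2))
```

One plausible reading of the results is that anchoring raters to specific elements constrains what each rater must judge, which is consistent with the much higher agreement reported for QNOTE (0.82) than for the single Global rating (0.24).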
This study suggests that physician decision making can be influenced by safety and cost information and that the order in which information is presented to physicians can affect their decisions.
Background and Objectives: Collection of feedback regarding medical student clinical experiences for formative or summative purposes remains a challenge across clinical settings. The purpose of this study was to determine whether the use of a quick response (QR) code-linked online feedback form improves the frequency and efficiency of rater feedback. Methods: In 2016, we compared paper-based feedback forms, an online feedback form, and a QR code-linked online feedback form at 15 family medicine clerkship sites across the United States. Outcome measures included usability, number of feedback submissions per student, number of unique raters providing feedback, and timeliness of feedback provided to the clerkship director. Results: The feedback method was significantly associated with usability, with the QR code scoring highest and paper second. Accessing feedback via QR code was associated with the shortest time to prepare feedback. Across four rotations, separate repeated measures analyses of variance showed no effect of feedback system on the number of submissions per student or the number of unique raters. Conclusions: The results of this study demonstrate that preceptors in the family medicine clerkship rate QR code-linked feedback as a highly usable platform. Additionally, this platform resulted in faster form completion than paper or online forms. An overarching finding of this study is that feedback forms must be portable and easily accessible. Potential implementation barriers and the social norm for providing feedback in this manner need to be considered.
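The study's form platform and link are not specified in the abstract, but generating a QR code that points at an online feedback form is straightforward. A minimal sketch using the Python `qrcode` package, with a placeholder URL standing in for the real form:

```python
# Minimal sketch: generate a QR code that links to an online feedback form.
# The URL below is hypothetical; requires `pip install qrcode[pil]`.
import qrcode

FEEDBACK_FORM_URL = "https://example.org/clerkship-feedback?site=15"  # placeholder

img = qrcode.make(FEEDBACK_FORM_URL)  # returns a PIL-backed image
img.save("feedback_form_qr.png")      # print and post where preceptors work
print("QR code written to feedback_form_qr.png")
```

A printed code posted in the precepting area keeps the form one phone scan away, which fits the study's finding that feedback forms must be portable and easily accessible.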
Introduction: The steady increase in the use of computed tomography (CT) raises particular concerns for children. Family physicians must often select pediatric imaging without any decision support. We hypothesized that point-of-care decision support would lead to the selection of imaging that lowered radiation exposure and improved guideline congruence. Methods: Our double-blind, randomized simulation included family physicians in the Military Health System. Participants initially reviewed a pediatric hematuria scenario and selected imaging without decision support. Participants were subsequently randomized to receive either imaging-appropriateness guidelines followed by estimated radiation exposure information, or estimated radiation exposure information followed by the guidelines; imaging selections were required after each step. The primary outcome was the imaging modality selected with point-of-care decision support. Results: The first arm increased CT ordering after viewing the guidelines (P = .008) but then decreased it after reviewing radiation exposure information (P = .007). In the second arm, radiation information decreased CT and plain film use (P = not significant), with a subsequent increase in ultrasound and CT after the guideline presentation (P = .05). Conclusions: Decision support during a simulated pediatric scenario helped family physicians select imaging that lowered radiation exposure and was aligned with current guidelines, especially when radiation information was presented after guideline review. This information could help inform electronic medical record design. (J Am Board Fam Med 2015;28:343-350.)