To examine interface issues in an educational software program, we recruited twenty-two users from health sciences departments for a usability study. Using the GramStain-Tutor™ (GST) CD-ROM, these users performed tasks involving navigating the program and using various interactive features. Their actions and comments were recorded with a digital video recorder and an audio recorder. The main results of our study are: (a) the majority of the users did not use the multiple navigational options available in the program; (b) navigational patterns differed based on the training background and content knowledge of the users; (c) several visual cues critical to program use were not discovered by the majority of the users; and (d) icons representing specific functions were not intuitive from the users' perspective. We demonstrated how a usability study with a relatively small number of users can identify specific problems in interface design. We recommend that these interface problems be addressed before conducting educational studies that examine how educational software programs affect student learning.

Usability testing

Usability testing is a collection of evaluation methods that is increasing in popularity. It is based on observing how users perform tasks and obtaining feedback from users through formal tests or compilations of user comments (Nielsen, 1993). These results lead to a set of recommendations that are fed back into the design process with the end goal of improving the product. Thus, usability testing is an iterative process of collecting response data or user feedback on a product, redesigning the product in response to that feedback, and re-testing the product to examine whether the changes have enhanced usability.

In a typical usability session, the following issues are explored (Nielsen, 1993; Salvemini, 1998):

• Can users easily learn a computer system supported by sufficient navigational control?
• Is the system efficient to use, or is usability hindered by an excessive number of icons and menu options?
• Are system features easy to remember, so that the time needed to re-learn the system is minimized?
• Does the system produce few errors and provide efficient error-recovery mechanisms?
• Is the system pleasant to use?

These questions can be answered by having users complete tasks while using the computer system. Tasks refer to a set of representative actions routinely performed in a particular working context. Often, users with different levels of content expertise and computer skills are asked to carry out these tasks. With as few as 4-5 subjects participating in a task-based study, 80% of usability problems can be identified (Virzi, 1992); a worked example of this estimate is sketched below.

Usability techniques include both qualitative and quantitative approaches, such as observations, think-aloud protocols, questionnaires, interviews, logging actual use, and user feedback through debriefing (Nielsen, 1993). Typically, a checklist, audio recorders, and video recorders are used in observation sessions. Users are encouraged to articulate their thought processes while performing tasks. This "think-aloud...
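To make the 4-5 subject figure concrete, here is a minimal sketch (in Python; our illustration, not part of either study) of the widely cited problem-discovery model, found(n) = 1 - (1 - p)^n, where p is the probability that a single user uncovers a given problem. The default p = 0.31 is the average commonly attributed to Nielsen and Landauer's work; treat it as an assumption, since individual projects vary.

    # Illustration only: problem-discovery model for usability testing.
    # found(n) = 1 - (1 - p)^n, where p is the chance that a single
    # user uncovers a given problem. p = 0.31 is a commonly cited
    # average (an assumption here; real studies vary).

    def problems_found(n_users: int, p: float = 0.31) -> float:
        """Estimated proportion of usability problems found by n_users testers."""
        return 1.0 - (1.0 - p) ** n_users

    if __name__ == "__main__":
        for n in (1, 3, 4, 5, 10):
            print(f"{n:2d} users: {problems_found(n):.0%} of problems found")
        # With p = 0.31, four users find roughly 77% and five users
        # roughly 84% of problems, consistent with the 4-5 subject /
        # 80% figure cited from Virzi (1992) above.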
Background: The microscopic examination of urine sediment is one of the most commonly performed microscope-based laboratory tests, but despite its widespread use, there has been no detailed study of the competency of medical technologists in performing this test. One reason for this is the lack of an effective competency assessment tool that can be applied uniformly across an institution.

Methods: This study describes the development and implementation of a computer program, Urinalysis-Review™, which periodically tests competency in microscopic urinalysis and then summarizes individual and group test results. In this study, eight Urinalysis-Review exams were administered over 2 years to medical technologists (mean, 58 technologists per exam; range, 44–77) at our academic medical center. The eight exams contained 80 test questions, consisting of 72 structure identification questions and 8 quantification questions. The 72 structure questions required the identification of 134 urine sediment structures consisting of 63 examples of cells, 25 of casts, 18 of normal crystals, 8 of abnormal crystals, and 20 of organisms or artifacts.

Results: Overall, the medical technologists correctly identified 84% of cells, 72% of casts, 79% of normal crystals, 65% of abnormal crystals, and 81% of organisms and artifacts, and correctly answered 89% of the quantification questions. The results are probably a slight underestimate of competency because the images were analyzed without knowledge of urine chemistry results.

Conclusions: The study shows the feasibility of using a computer program for competency assessment in the clinical laboratory. In addition, the study establishes baseline measurements of competency that other laboratories can use for comparison, and which we will use in future studies that measure the effect of continuing education efforts in microscopic urinalysis.