Increasingly, healthcare policy and decision makers are demanding evidence to justify investments in health information systems, a demand that requires adequate evaluation of these systems. A wide variety of approaches and methodologies have been applied in assessing the impact of information systems in health care, ranging from controlled clinical trials to questionnaires and interviews with users. In this paper we describe methodological approaches that we have applied and refined over the past 10 years for the evaluation of health information systems. The approaches are strongly rooted in theories and methods from cognitive science and the emerging field of usability engineering. The focus is on assessing human-computer interaction and, in particular, the usability of computer systems in both laboratory and naturalistic settings. The methods described can form part of the formative evaluation of systems during their iterative development, and can also complement more traditional assessment methods used in summative evaluation of completed systems. The paper provides a review of the general area of systems evaluation, along with the motivation and rationale for the methodological approaches underlying usability engineering and cognitive task analysis as applied to health information systems. This is followed by a detailed description of the methods we have applied in a variety of settings when conducting usability testing and usability inspection of systems such as computer-based patient records. Emerging trends in the evaluation of complex information systems are discussed.
Heuristic evaluation, when modified for medical devices, is a useful, efficient, and low-cost method for evaluating the patient safety features of medical devices through the identification of usability problems and their severities.
Promoting patient safety is a national priority. To evaluate interventions for reducing medical errors and adverse events, effective methods for detecting such events are required. This paper reviews the current methodologies for detecting adverse events and discusses their relative advantages and limitations. It also presents a cognitive framework for error monitoring and detection. While manual chart review has been considered the "gold standard" for identifying adverse events in many patient safety studies, this methodology is expensive and imperfect. Investigators have developed, or are currently evaluating, several electronic methods that can detect adverse events using coded data, free-text clinical narratives, or a combination of techniques. Advances in these systems will greatly facilitate our ability to monitor adverse events and promote patient safety research. But these systems will perform optimally only if we improve our understanding of the fundamental nature of errors and the ways in which the human mind can naturally, but erroneously, contribute to the problems we observe.