Background
Heuristic evaluations, while commonly used, may inadequately capture the severity of identified usability issues. In health care, usability issues can pose different levels of risk to patients. Incorporating diverse expertise (eg, clinical and patient) in the heuristic evaluation process can help assess and address potential negative impacts on patient safety that may otherwise go unnoticed. One document that should be highly usable for patients, with the potential to prevent adverse outcomes, is the after visit summary (AVS). The AVS is the document given to patients upon discharge from the emergency department (ED); it contains instructions on how to manage symptoms, medications, and follow-up care.
Objective
This study aims to assess a multistage method for integrating diverse expertise (ie, clinical, older adult care partner, and health IT) with human factors engineering (HFE) expertise in the usability evaluation of the patient-facing ED AVS.
Methods
We conducted a 3-stage heuristic evaluation of an ED AVS using heuristics developed for evaluating patient-facing documentation. In stage 1, HFE experts reviewed the AVS to identify usability issues. In stage 2, 6 experts with varying expertise (ie, emergency medicine physicians, ED nurses, geriatricians, transitional care nurses, and an older adult care partner) rated each previously identified usability issue on its potential impact on patient comprehension and patient safety. Finally, in stage 3, an IT expert reviewed each usability issue to assess the likelihood of successfully addressing it.
Results
In stage 1, we identified 60 usability issues that violated a total of 108 heuristics. In stage 2, the study experts identified 18 additional usability issues that violated 27 heuristics. Impact ratings ranged from all experts rating an issue as “no impact” to 5 of 6 experts rating an issue as having a “large negative impact.” The older adult care partner rated usability issues as having a greater impact more often than the other experts did. In stage 3, the IT expert rated 31 usability issues as “impossible to address,” 21 as “maybe” addressable, and 24 as “can be addressed.”
Conclusions
Integrating diverse expertise when evaluating usability is important when patient safety is at stake. The non-HFE experts included in stage 2 of our evaluation identified 23% (18/78) of all the usability issues and, depending on their expertise, rated those issues as having differing impacts on patient comprehension and safety. Our findings suggest that, to conduct a comprehensive heuristic evaluation, expertise from all the contexts in which the AVS is used must be considered. Combining those findings with feasibility ratings from an IT expert allows usability issues to be addressed strategically through redesign. Thus, a 3-stage heuristic evaluation method offers a framework for integrating context-specific expertise efficiently while providing practical insights to guide human-centered design.