Objective: This study aimed to organize the literature on cognitive aids to allow comparison of findings across studies and to link the applied work of aid development to psychological constructs and theories of cognition. Background: Numerous taxonomies have been developed, all of which label cognitive aids by their surface characteristics. This complicates integration of the literature, because one type of aid, such as a checklist, can provide many different forms of support (e.g., prospective memory for steps and decision support for alternative diagnoses). Method: In this synthesis of the literature, we address the disparate findings and organize them at their most basic level: Which cognitive processes does the aid need to support? Which processes does it support? Such processes include attention, perception, decision making, memory, and declarative knowledge. Results: Cognitive aids can be classified according to the processes they support. Some studies focused on how an aid supports the cognitive processes demanded by the task (aid function). Other studies focused on supporting the processes needed to use the aid (aid usability). Conclusion: Classifying cognitive aids according to the processes they support allows comparison across studies in the literature and provides a formalized way of planning the design of new cognitive aids. Once the literature is organized, theory-based guidelines and applied examples can be used by cognitive aid researchers and designers. Application: Aids can be designed according to the cognitive processes they need to support, and designers can be clear about their focus, whether examining how to support specific cognitive processes or improving the usability of the aid.
The present study examined a diagnostic medical decision aid developed to help inexperienced operators diagnose and treat a simulated patient. Diagnostic and treatment accuracy with the tool were assessed and compared between physicians and non-physicians. Initial analysis revealed more accurate diagnostic and treatment choices by non-physicians, but further investigation showed that physicians had recognized signs of another diagnosis and had diagnosed and treated correctly given the limited information in the patient simulation. This finding fit with other observed behaviors: non-physicians opened the diagnostic support tool within the aid more often than physicians and frequently returned to the tool during the task. In general, the aid supported non-physicians in choosing the correct diagnosis and treatment, while physicians disregarded the aid’s recommendations and made decisions based on their own expertise. These results have implications for the development of future decision support aids for non-physicians performing medical procedures.
Guidelines are needed to help designers create effective cognitive aids for complex procedures, particularly when those aids support non-expert operators. To create such guidelines, the factors that might interact with aid design, such as time pressure and the number of operators, need to be explored with non-experts. We manipulated time pressure and the number of team members setting up a medical ventilator while using a cognitive aid. We measured dependence on the aid, performance, subjective workload, and team dynamics to better understand their influence on the use of a cognitive aid by non-expert operators accomplishing a complex task. Individuals reported significantly higher mental workload than teams, and participants under time pressure reported higher temporal workload. These data can contribute to design guidelines.