We performed a human-in-the-loop study to explore the role of transparency in engendering trust and reliance within highly automated systems. Specifically, we examined how transparency affects trust in, and reliance upon, the Autonomous Constrained Flight Planner (ACFP), a critical automated system being developed as part of NASA's Reduced Crew Operations (RCO) concept. The ACFP is designed to provide an enhanced ground operator, termed a super dispatcher, with recommended diversions for aircraft when their primary destinations are unavailable. In the current study, 12 commercial transport-rated pilots playing the role of super dispatchers completed six time-pressured "all land" scenarios in which they used the ACFP to determine diversions for multiple aircraft. Two factors were manipulated. The primary factor was level of transparency. In low transparency scenarios, the pilots were given a recommended airport and runway, plus basic information about the weather conditions, the aircraft types, and the airport and runway characteristics at that and other airports. In moderate transparency scenarios, the pilots were also given a risk evaluation for the recommended airport, and for the other airports if they requested it. In high transparency scenarios, additional information, including the reasoning behind the risk evaluations, was made available to the pilots. The secondary factor was level of risk, either high or low. For high-risk aircraft, all potential diversions were rated as highly risky, with the ACFP giving the best option in a bad situation. For low-risk aircraft, the ACFP found only low-risk options for the pilot. Both subjective and objective measures were collected, including rated trust, whether the pilots checked the validity of the automation's recommendation, and whether the pilots eventually flew to the recommended diversion airport.
Key results show that: 1) pilots' trust increased with higher levels of transparency, 2) pilots were more likely to verify the ACFP's recommendations when transparency was low and when risk was high, 3) pilots were more likely to explore other options from the ACFP when transparency was low and when risk was high, and 4) pilots' acceptance of the ACFP's recommendations increased with the transparency of the explanation. The finding that higher levels of transparency were coupled with higher levels of trust, a lower need to verify other options, and higher levels of agreement with ACFP recommendations confirms the importance of transparency in supporting reliance on automated recommendations. Additional analyses of qualitative data gathered from subjects through surveys and during debriefing interviews also provided the basis for new design recommendations for the ACFP.