We performed a human-in-the-loop study to explore the role of transparency in engendering trust and reliance within highly automated systems. Specifically, we examined how transparency affects trust in and reliance upon the Autonomous Constrained Flight Planner (ACFP), a critical automated system being developed as part of NASA's Reduced Crew Operations (RCO) concept. The ACFP is designed to provide an enhanced ground operator, termed a super dispatcher, with recommended diversions for aircraft when their primary destinations are unavailable. In the current study, 12 commercial transport-rated pilots playing the role of super dispatchers completed six time-pressured "all land" scenarios in which they used the ACFP to determine diversions for multiple aircraft. Two factors were manipulated. The primary factor was level of transparency. In low-transparency scenarios, pilots were given a recommended airport and runway, plus basic information about the weather conditions, the aircraft types, and the airport and runway characteristics at that and other airports. In moderate-transparency scenarios, pilots were also given a risk evaluation for the recommended airport and, on request, for the other airports. In high-transparency scenarios, additional information, including the reasoning behind the risk evaluations, was made available to the pilots. The secondary factor was level of risk, either high or low. For high-risk aircraft, all potential diversions were rated as highly risky, with the ACFP offering the best option in a bad situation. For low-risk aircraft, the ACFP found only low-risk options. Both subjective and objective measures were collected, including rated trust, whether pilots checked the validity of the automation's recommendation, and whether pilots ultimately flew to the recommended diversion airport.
Key results show that: 1) pilots' trust increased with higher levels of transparency; 2) pilots were more likely to verify the ACFP's recommendations at low levels of transparency and when risk was high; 3) pilots were more likely to explore other options from the ACFP in low-transparency conditions and when risk was high; and 4) pilots' acceptance of the ACFP's recommendations increased as a function of the transparency of the explanation. The finding that higher levels of transparency were coupled with higher trust, a lower need to verify other options, and greater agreement with ACFP recommendations confirms the importance of transparency in supporting reliance on automated recommendations. Additional analyses of qualitative data gathered from subjects through surveys and debriefing interviews also provided the basis for new design recommendations for the ACFP.
In this paper we describe results from the first year of a field study examining U.S. Air Force (USAF) F-16 pilots' trust in the Automatic Ground Collision Avoidance System (Auto-GCAS). Using semistructured interviews focusing on opinion development and evolution, system transparency and understanding, the pilot-vehicle interface, stories and reputation, usability, and impact on behavior, we identified factors positively and negatively influencing trust, with data analysis methods based in grounded theory. Overall, Auto-GCAS is an effective life- and aircraft-saving technology and is generally well received and trusted appropriately, with trust evolving based on factors including a healthy skepticism of the system, attribution of system faults to hardware problems, and trust informed by reliable performance (e.g., lives saved). Unanticipated findings included pilots reporting that their reputations were not negatively affected by system activations, and that an interface anticipation cue had the potential to change operational flight behavior. We discuss emergent research avenues in the areas of transparency and culture, and the value of conducting trust research with operators of real-world systems with high levels of autonomy.