Trust is an important factor in building acceptance of autonomous vehicles (AVs) within our society, but the complex nature of trust makes it challenging to design for an appropriate level of trust, which can lead to instances of mistrust and/or distrust between users and AVs. Designing for calibrated trust is a possible approach to addressing this challenge. Existing research on designing for calibrated trust focuses on the human-machine interaction (HMI), whereas the literature suggests that trust formation begins well before the first interaction between a user and an AV. The goal of our research is to broaden the scope of calibrated trust by exploring the pre-use phase and understanding the challenges faced in the calibration of trust. In our study, we interviewed 16 mobility experts and conducted a thematic analysis of the interviews. The analysis revealed a lack of clear communication between stakeholders, a solutionist approach to design, and a lack of transparency in design as the most prominent challenges. Building on these insights, we briefly introduce the Calibrated Trust Toolkit as our design solution, and conclude by proposing a sweet spot for achieving calibration of trust between users and autonomous vehicles.