The inappropriate use of automation as a result of trust issues is a major barrier to the broad market penetration of automated vehicles. Studies so far have shown that providing information about the vehicle’s actions and intentions can help calibrate trust and promote user acceptance. However, how such feedback should be designed optimally remains an open question. This article presents the results of two user studies. In the first study, we investigated the subjective trust and user experience of participants (N=21) driving in a fully automated vehicle that interacts with other traffic participants in virtual reality. The analysis of questionnaires and semi-structured interviews shows that participants request feedback about the vehicle’s status and intentions and prefer visual feedback over other modalities. Consequently, we conducted a second study to derive concrete requirements for future feedback systems. We showed participants (N=56) various videos of an automated vehicle from the ego perspective and asked them to select elements in the environment they wanted feedback about so that they would feel safe, trust the vehicle, and understand its actions. The results confirm a correlation between subjective user trust and feedback needs and highlight essential requirements for automatic feedback generation. The results of both experiments provide a scientific basis for designing more adaptive and personalized in-vehicle interfaces for automated driving.