Autonomous vehicles are complex systems that may behave in unexpected ways. From the drivers' perspective, this can cause stress and lower trust in and acceptance of autonomous driving. Prior work has shown that explanations of system behavior can mitigate these negative effects. Nevertheless, it remains unclear in which situations drivers actually need an explanation and what kind of interaction is relevant to them. Using thematic analysis of real-world experience reports, we first identified 17 situations in which a vehicle behaved unexpectedly. We then conducted a think-aloud study (N = 26) in a driving simulator to validate these situations and enrich them with qualitative insights into drivers' need for explanation. We identified six categories that describe the main concerns and topics during unexpected driving behavior: emotion and evaluation, interpretation and reason, vehicle capability, interaction, future driving prediction, and explanation request times. Based on these categories, we suggest design implications for autonomous vehicles, in particular regarding collaboration insights, user mental models, and explanation requests.

CCS CONCEPTS
• Human-centered computing → HCI theory, concepts and models; Empirical studies in HCI; Scenario-based design.