Recently, requirements for the explainability of software systems have gained prominence. One of the primary motivators for such requirements is that explainability is expected to facilitate stakeholders' trust in a system. Although this seems intuitively appealing, recent psychological studies indicate that explanations do not necessarily facilitate trust. Thus, explainability requirements might not be suitable for promoting trust. One way to accommodate this finding is, we suggest, to focus on trustworthiness instead of trust. While these two may come apart, we ideally want both: a trustworthy system and the stakeholder's trust. In this paper, we argue that even though trustworthiness does not automatically lead to trust, there are several reasons to engineer primarily for trustworthiness, and that a system's explainability can crucially contribute to its trustworthiness.