2023
DOI: 10.31234/osf.io/qtka5
Preprint

What kind of Explainable AI do users need?

Alessandra Buccella, Emma Chen, Erik Linstead, et al.

Abstract: Artificial intelligence (AI) systems that make predictions, translate text, generate text and images, and much more are ubiquitous in modern life. Such AI systems typically perform tasks that humans cannot perform with the same degree of speed and efficiency, so they are increasingly integrated into human decision-making. But they also tend to be black boxes, in the sense that even experts in machine learning do not understand how they arrive at their output from their given inputs. In this paper, we argue tha…

Cited by 0 publications
References 43 publications