2022
DOI: 10.31234/osf.io/6az2h
Preprint
I humanize, therefore I understand? Effects of explanations and humanization of intelligent systems on perceived and objective user understanding

Abstract: The functioning of intelligent systems can be opaque to users. Yet, users need to make informed choices about them. This work compares two knowledge mechanisms, i.e., ways for users to achieve an understanding of intelligent systems: explanation and humanization. In an online experiment (N = 416), we compared the effects of a control condition without any explanation against a) a neutral and b) a humanized how-explanation as well as c) active humanization on (perceived and objective) user understanding and system…

Cited by 2 publications (1 citation statement)
References 41 publications
“…Gunning and Aha (2019) summarise three main needs for good explainable AI: (1) producing more explainable models, which would be necessary for a technical and social evaluation of an AI product; (2) designing better explanation interfaces to facilitate interaction with the required knowledge; and (3) understanding the psychological requirements for providing effective explanations to humans, which will affect the opportunity for technically literate or non-technically literate individuals in a given society to participate in the evaluation of a tool. A paper by Ngo and Krämer (2022) concludes that explanations can lead to a misleading sense of understanding.…”
Section: Transparency (mentioning)
Confidence: 99%