2018
DOI: 10.1177/2053951718756684
Understanding perception of algorithmic decisions: Fairness, trust, and emotion in response to algorithmic management

Abstract: Algorithms increasingly make managerial decisions that people used to make. Perceptions of algorithms, regardless of the algorithms' actual performance, can significantly influence their adoption, yet we do not fully understand how people perceive decisions made by algorithms as compared with decisions made by humans. To explore perceptions of algorithmic management, we conducted an online experiment using four managerial decisions that required either mechanical or human skills. We manipulated the decision-ma…

Cited by 701 publications (636 citation statements)
References 57 publications
“…On the other hand, this is in line with former research that suggested that feelings of uncertainty and ambiguity are especially prevalent in novel situations where people do not know what to do, what to feel, or how to judge the situations (Langer & König, ; Tene & Polonetsky, ). This might also account for the result that the low‐stake interview was perceived as being less ambiguous, as it might be more familiar to use automation for low‐stake scenarios such as training or work scheduling (see also Lee, ; Zyda, ). However, we were surprised by the fact that there was no difference in emotional creepiness between the groups, not even for automated high‐stake interviews.…”
Section: Discussion
confidence: 99%
“…Both studies suggest that people get less credit for their influence on the decision‐making process as soon as there are decision‐support systems involved, which also implies that people believe that human influence on decisions reduces with automatic decision‐support. In a similar vein, Lee () found that people react negatively to automatic decisions in personnel selection situations compared to automatic decisions in work scheduling situations and argued that this difference is due to the reduction of human influence on the decision. On the positive side, reducing the human influence on decisions might also be perceived to reduce possible biases and thus enhance consistency in the decision‐making process (which people seem to tend to believe when thinking about automatic decisions; Miller, Katz, & Gans, ).…”
Section: Background and Hypotheses Development
confidence: 96%