2023
DOI: 10.1016/j.chb.2022.107446
Let the user have a say - voice in automated decision-making

Cited by 11 publications (3 citation statements)
References 49 publications
“…To address the power imbalance between platform workers and companies, the first step should involve increasing transparency and explainability within the platform economy (Rani and Furrer, 2021). Additionally, outcome control (in addition to outcome explanations) improves perceived justice by allowing people a voice and bringing human elements into final decisions (Hellwig et al., 2023; Lee et al., 2019).…”
Section: Discussion
confidence: 99%
“…The robot played only an advisory role. The final decision was made by the subjects themselves, as voice influences the fairness assessment of algorithm-based decision systems (Hellwig et al., 2023), and algorithm aversion should not be unnecessarily exacerbated (Burton et al., 2020; Bigman and Gray, 2018).…”
Section: The Recommendations
confidence: 99%
“…Given the significant number of studies that have examined the perceived fairness of AI-based decisions (for overviews see Langer & Landers, 2021; Starke et al., 2022), these studies could also provide intuition as to what factors might influence people's perceived fairness and thus their response bias in detecting unfairness. To name just a few, the context of use (Lee, 2018), the stakes of the decisions (Langer et al., 2019), feature properties (e.g., their relevance and reliability, Grgić-Hlača et al., 2018; how sensitive the features are, Nyarko et al., 2021), the possibility to appeal algorithmic decisions (Hellwig et al., 2023), explanations (Lee et al., 2019; Newman et al., 2020; Schlicker et al., 2021), explanation styles (Binns et al., 2018; Dodge et al., 2019), and visualization techniques (Van Berkel et al., 2021) have been shown to affect the perceived fairness of system decisions, and future research could test whether these effects translate into a change in response bias. It may also be interesting to examine whether perceived fairness mainly affects people's response bias or whether there is also a relationship with people's sensitivity.…”
Section: Response Bias
confidence: 99%