2017
DOI: 10.31228/osf.io/97upg
Preprint
Slave to the Algorithm? Why a 'right to an explanation' is probably not the remedy you are looking for

Abstract: Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals' lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a "right to an explanation" has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic "black box" to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any…

Cited by 211 publications (209 citation statements). References 1 publication.
“…Do we want to understand all the patterns the machine has learnt (model-centric explanations)? Or just those that are relevant to the patient (subject-centric explanations)? [22] The former aims to provide global understanding about the relative importance of all variables and how they interact to make predictions, which may shed new light on disease mechanisms; the latter provides local understanding about why this particular input led to that particular output, which could be relevant for individual patient prognosis.…”
Section: Right to Explanation? (mentioning; confidence: 99%)
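The model-centric versus subject-centric distinction quoted above can be made concrete with a toy model. The following is a minimal sketch, not code from the cited paper: the feature names, weights, and baseline values are invented for illustration. A linear risk score makes the contrast easy to see, since the global explanation (the model's weights) is the same for every patient, while the local explanation (each feature's contribution to one score) depends on that patient's input.

```python
# Hypothetical linear risk model: score = sum of weighted deviations from a
# population baseline. All names and numbers below are illustrative only.
weights = {"age": 0.8, "blood_pressure": 1.5, "cholesterol": 0.3}
baseline = {"age": 50.0, "blood_pressure": 120.0, "cholesterol": 200.0}

def predict(x):
    """Risk score for one patient: weighted deviation from the baseline."""
    return sum(w * (x[f] - baseline[f]) for f, w in weights.items())

def model_centric_explanation():
    """Global view: which features the model weights most heavily overall.
    Identical for every patient."""
    return sorted(weights.items(), key=lambda kv: abs(kv[1]), reverse=True)

def subject_centric_explanation(x):
    """Local view: how much each feature of *this* patient moved the score."""
    contrib = {f: w * (x[f] - baseline[f]) for f, w in weights.items()}
    return sorted(contrib.items(), key=lambda kv: abs(kv[1]), reverse=True)

patient = {"age": 62.0, "blood_pressure": 118.0, "cholesterol": 260.0}
print(model_centric_explanation())           # blood_pressure ranked first globally
print(subject_centric_explanation(patient))  # cholesterol dominates for this patient
```

Note how the two explanations rank features differently: blood pressure carries the largest weight globally, yet for this particular patient the cholesterol deviation contributes most to the score — exactly the gap between "what the machine has learnt" and "why this input led to that output".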
“…They have no way of knowing whether or not Facebook will respond to an individual complaint, how it will respond, and whether a human will be tasked with any such response. While there are currently debates about whether the new EU GDPR may provide a "right to explanation" for algorithmic decisions (Goodman & Flaxman, 2016), it is entirely unclear what this right would actually look like in practice (Wachter, Mittelstadt, & Floridi, 2016), and whether it would be fit for the purposes discussed here (Edwards & Veale, 2017). Despite this, it is important to acknowledge that the GDPR provides numerous additional rights to users which should contribute to shedding greater light on algorithmic decision making (Article 29 Data Protection Working Party, 2017).…”
Section: Outsourced Facebook Content Moderation (mentioning; confidence: 99%)
“…The interface scaffold is critical if the learner is to really understand their model by scrutinising each part of it. This guideline aims to avoid the risk that "Transparency can privilege seeing over understanding" (Ananny & Crawford, 2018; Edwards & Veale, 2017).…”
Section: Discussion: Guidelines for Creating PUMLs (mentioning; confidence: 99%)