2019
DOI: 10.1108/oir-10-2018-0332
Situated algorithms: a sociotechnical systemic approach to bias

Abstract: Purpose The purpose of this paper is to propose that in order to tackle the question of bias in algorithms, a systemic, sociotechnical and holistic perspective is needed. With reference to the term “algorithmic culture,” the interconnectedness and mutual shaping of society and technology are postulated. A sociotechnical approach requires translational work between and across disciplines. This conceptual paper undertakes such translational work. It exemplifies how gender and diversity studies, by bringing in ex…

Cited by 43 publications (25 citation statements)
References 34 publications
“…According to such evidence, it is clear that algorithmic audits play a fundamental role in informing strategies for engaging both researchers and corporations in effectively addressing algorithmic bias. Mitigating algorithmic bias is a difficult task that requires a systemic, sociotechnical and holistic perspective [56]. The publication of the Gender Shades study [18], for instance, not only played a significant role in sparking interest in gender and ethnicity classification but also motivated the research community to investigate bias and discrimination in facial analysis algorithms and systems in other domains, for instance, detecting bias against older adults in clinical settings [6] and bias against children in state-of-the-art pedestrian detection algorithms. In addition, it had a major positive effect on commercial gender classification services.…”
Section: Discussion (mentioning)
confidence: 99%
“…Importantly, data justice and feminist scholarship have shown that some users will invariably appropriate the system in ‘other’ ways (e.g. D’Ignazio and Klein, 2020; Draude et al., 2019). 2 To understand how predictive analytics is reconfiguring education, it is important to also highlight ambivalences, and trace how predictive tools are embedded into unexpected or unforeseen practices.…”
Section: Implications and Uptake (mentioning)
confidence: 99%
“…In the use of social machines, such stipulations then mostly appear as inherent technological laws (Code is Law) or are neither comprehensible nor visible to users [9]. This must be counteracted by appropriate design procedures.…”
Section: (3) Daten als Grundstoff sozialen Zusammenhalts [Data as the raw material of social cohesion] (unclassified)