2021
DOI: 10.7146/kkf.v29i2.124899
Towards Decolonising Computational Sciences

Abstract: This article sets out our perspective on how to begin the journey of decolonising computational fields, such as data and cognitive sciences. We see this struggle as requiring two basic steps: a) realisation that the present-day system has inherited, and still enacts, hostile, conservative, and oppressive behaviours and principles towards women of colour; and b) rejection of the idea that centring individual people is a solution to system-level problems. The longer we ignore these two steps, the more “our” aca…

Cited by 38 publications (46 citation statements)
References 39 publications
“…When we see bias and discrimination, what we see is problems that have surfaced as a result of a field that has thoughtlessly inherited deeply rooted unjust, racist, and white supremacist histories and practices. 56 As D'Ignazio and Klein 42 contend, “addressing bias in a dataset is a tiny technological Band-Aid for a much larger problem.” Furthermore, underlying the idea of “fixing” bias is the assumption that there exists a single correct description of reality where a deviation from it has resulted in bias.…”
Section: Knowing That Centers Human Relations (mentioning)
confidence: 99%
“…And on the other hand, we need to appreciate how these concepts of difference (sex, gender, class, and “race”) are intersectionally intertwined with each other. For interdisciplinary or rather intersectional approaches, we need to ask to what extent the categories of “race” or class have found their way into the statistical measurement strategies of contemporary brain research (Abiodun 2019; Birhane and Guest 2020; Rollins 2021).…”
Section: Discussion (mentioning)
confidence: 99%
“…In a similar vein, Birhane and Guest (2020) argue against the tokenistic appointment of a few black men and women in positions of influence without changing and challenging the underlying structures that uphold current systems of power. They therefore argue for a gradual process of decolonization through looking at the history of the disciplines involved in the production of new technologies such as AI, and an opening up of the field to those who challenge established orthodoxies (Birhane & Guest, 2020). While Birhane and Guest propose a historically nuanced understanding, Mohamed et al (2020) look to the future, proposing the development and integration of foresight in the design of technological systems, such that it can recognize and preempt any systemic discrimination.…”
Section: AI and Content Moderation: A Decolonial Approach (mentioning)
confidence: 99%
“…What may be an alternative position? In the final section of the article, following Quijano (2007a, b) and more recent works by Benjamin (2019), Birhane and Guest (2020), Mohamed et al (2020) and Adams (2021), we develop a sketch for a decolonial approach to AI in content moderation.…”
Section: Introduction: Race and Content Moderation (mentioning)
confidence: 99%