2019
DOI: 10.1111/zygo.12498

“White Crisis” and/as “Existential Risk,” or the Entangled Apocalypticism of Artificial Intelligence

Abstract: In this article, I present a critique of Robert Geraci's Apocalyptic artificial intelligence (AI) discourse, drawing attention to certain shortcomings which become apparent when the analytical lens shifts from religion to the race–religion nexus. Building on earlier work, I explore the phenomenon of existential risk associated with Apocalyptic AI in relation to “White Crisis,” a modern racial phenomenon with premodern religious origins. Adopting a critical race theoretical and decolonial perspective, I argue t…

Cited by 14 publications (8 citation statements)
References 43 publications
“…In the formula, a(t) is the average dissimilarity or distance between the sample point t in cluster C and all other samples in the cluster [14].…”
Section: Smart Finance and Accounting Management
confidence: 99%
“…Drawing on other scholars, however, I have shown that even in the ancient world not all communities we might label apocalyptic were on the social margins (2010, 26). Elsewhere, I have been criticized for insufficient attention to questions of race, ethnicity, and colonialism (Ali 2019). I have attempted to take such criticisms seriously and have shifted my approach (e.g., Geraci 2018, 20–3).…”
Section: A Hydra‐logical Approach
confidence: 99%
“…Second, Katz maintains that "AI serves the aims of whiteness - and thus is a tool in the arsenal of a white supremacist social order - but that it also mirrors the nebulous and shifting form of whiteness as an ideology" (155). While prosthetic and ideological readings of AI are quite plausible, I want to suggest they are not exhaustive of possibilities in terms of the relationship between AI and white supremacy; in my own work, for example, I interpret AI as an ontological refinement within the iterative logic of whiteness/white supremacy itself, one prompted by "White Crisis" (itself prompted by contestation of whiteness) and conceptualised in terms of shifts about "the line of the human" (Ali 2019, 2020). In short, rather than "AI [being] adapted, like whiteness, to challenges from social movements" (155), I want to suggest that AI is an adaptive iteration of whiteness itself.…”
confidence: 99%