Autonomous weapons systems (AWS) are emerging as key technologies of future warfare. To date, academic debate has concentrated on the legal-ethical implications of AWS, but this focus does not capture how AWS may shape norms by defining diverging standards of appropriateness in practice. In discussing AWS, the article formulates two critiques of constructivist models of norm emergence: first, constructivist approaches privilege the deliberative over the practical emergence of norms; and second, they overemphasise fundamental norms rather than also accounting for procedural norms, a concept we introduce in this article. Elaborating on these critiques allows us to address a significant gap in research: we examine how standards of procedural appropriateness emerging in the development and usage of AWS often contradict fundamental norms and public legitimacy expectations. Normative content may therefore be shaped procedurally, challenging conventional understandings of how norms are constructed and considered relevant in International Relations. In doing so, we outline the contours of a research programme on the relationship between norms and AWS, arguing that AWS can have fundamental normative consequences by setting novel standards of appropriate action in international security policy.
Research on norms in International Relations (IR) includes various concepts of how norms influence action. These approaches focus on the decision-making process and largely neglect the operationalization of norms. This omission leaves an analytical gap: a lack of attention to how the substance of abstract norms is transformed and constructed in the operationalization process. This article draws on the Foucauldian theme of governmentality to introduce a novel perspective on operationalizing norms. It focuses in particular on the role of techniques as an understudied element inherent to the reflexive processes of operationalization and meaning production. The article thereby contests the prevalence of fundamental norms in conventional IR theory. It demonstrates, instead, that global governance techniques do not simply translate rationalities into practice, but construct their very own normativities. These theoretical reflections are illustrated by analysing the operationalization of norms through indicators in the case of the European Union’s human rights policy.
Kallenborn, 2021). The CEO of the system's manufacturer STM maintains that the use of autonomy in the Kargu-2 is primarily restricted to navigation, and that "[u]nless an operator pushes the button, it is not possible for the drone to select a target and attack" (Tavsan, 2021). Despite this ambiguity, the incident has been widely portrayed as the first battlefield use of autonomous weapon systems (AWS), often colloquially called "killer robots" (see Mizokami, 2021; Stanley, 2021; Vincent, 2021). AWS are defined as "any weapons that select and apply force to targets without human intervention" (ICRC, 2022).¹ Militaries throughout the world have demonstrated

¹ In this article we generally refer to autonomous weapon systems (AWS). AWS are not a specific category of weapon. Rather, we understand AWS as being any type of weapon system which utilizes machine autonomy to select and apply force without immediate human control or intervention. While some autonomous weapons integrate AI elements into their critical functions, they may not all necessarily be based on AI technologies. We will only employ the term lethal autonomous weapon systems (LAWS) when specifically citing the discussion at the UN CCW, as this is the official term which states parties have used as part of this debate. The term "killer robots" is also commonly
The emergence of autonomous weapons systems (AWS) is increasingly in the academic and public focus. Research largely focuses on the legal and ethical implications of AWS as a new weapons category set to revolutionize the use of force. However, the debate on AWS neglects the question of what introducing these weapons systems could mean for how decisions are made. Pursuing this from a theoretical-conceptual perspective, the article critically analyzes what impact AWS can have on norms as standards of appropriate action. The article draws on the Foucauldian “apparatus of security” to develop a concept that accommodates the role of security technologies for the conceptualization of norms guiding the use of force. It discusses to what extent a technologically mediated construction of a normal reality emerges in the interplay of machinic and human agency and how this leads to the development of norms. The article argues that AWS provide a specific construction of reality in their operation and thereby define procedural norms that tend to replace the deliberative, normative-political decision on when, how, and why to use force. The article is a theoretical-conceptual contribution to the question of why AWS matter and why we should further consider the implications of new arrangements of human-machine interactions in IR.
The collection of eight articles in this special section provides insightful thinking points in the context of the political debate, the academic conversation, and the public interest in novel security technologies with autonomous features. These articles are also a call for further, empirically and theoretically informed research into the implications of AI for the international security dimension, specifically, and for societies more generally.