Weaponised artificial intelligence (AI) and the prospective development of lethal autonomous weapon systems (LAWS) have sparked international debate on retaining human control over the use of force. This article unpacks China's understanding of human–machine interaction and finds that it encompasses many shades of grey. Specifically, despite repeatedly supporting a legal ban on LAWS, China simultaneously promotes a narrow understanding of these systems that is intended to exclude them from what it deems "beneficial" uses of AI. We account for this ambivalent position by investigating how it is constituted through Chinese actors' competing practices in the areas of economy, science and technology, defence, and diplomacy. These practices produce normative understandings of human control and machine autonomy that pull China's position on LAWS in different directions. We contribute to the scholarship at the intersection of norm research and international practice theories by examining how normativity originates in and emerges from diverse domestic contexts through competing practices. We also aim to provide insights into possible approaches to achieving consensus in debates on regulating LAWS, which at the time of writing have reached a stalemate.
Kallenborn, 2021). The CEO of STM, the system's manufacturer, maintains that autonomy in the Kargu-2 is restricted primarily to navigation, and that "[u]nless an operator pushes the button, it is not possible for the drone to select a target and attack" (Tavsan, 2021). Despite this ambiguity, the incident has been widely portrayed as the first battlefield use of autonomous weapon systems (AWS), often colloquially called "killer robots" (see Mizokami, 2021; Stanley, 2021; Vincent, 2021). AWS are defined as "any weapons that select and apply force to targets without human intervention" (ICRC, 2022).¹ Militaries throughout the world have demonstrated

¹ In this article we generally refer to autonomous weapon systems (AWS). AWS are not a specific category of weapon. Rather, we understand AWS as any type of weapon system that uses machine autonomy to select and apply force without immediate human control or intervention. While some autonomous weapons integrate AI elements into their critical functions, not all are necessarily based on AI technologies. We employ the term lethal autonomous weapon systems (LAWS) only when specifically citing the discussion at the UN CCW, as this is the official term that states parties have used in that debate. The term "killer robots" is also commonly
to interpret legal and foreign policy instruments developed by the Chinese government, as well as other developing countries. Her work has been published in International Affairs, The Pacific Review, International Relations of the Asia Pacific, and Policy Studies, among others. She is the author of the book Promoting UN-ASEAN Coordination: Policy Transfer and Regional Cooperation Against Human Trafficking in Southeast Asia (Edward Elgar, forthcoming).