2020
DOI: 10.2139/ssrn.3548314
The (Erroneous) Requirement for Human Judgment (and Error) in the Law of Armed Conflict

Cited by 4 publications (4 citation statements); References 0 publications.
“…The ICRC and a number of States, for instance, have concluded that human control must always be present, to prevent any accountability gaps: "combatants have a unique obligation to make the judgements required of them by the [IHL] rules governing the conduct of hostilities, and this responsibility cannot be transferred to a machine, a piece of software or an algorithm". 128 The same conclusion was reached by all High Contracting Parties to the Convention on Certain Conventional Weapons. 129 Conversely, a number of States and commentators have taken the view that such accountability concerns are not particularly significant, and that it is sufficient that there is some "appropriate level of human judgment" in the deployment of autonomous weapons, without specifying where that judgement need necessarily be exercised.…”
Section: Principle of Individual Responsibility (mentioning)
confidence: 59%
“…As an illustration, it has been argued that "[m]eaningful human control over the use of weapons is consistent with and promotes compliance with the principles of international humanitarian law, notably distinction and proportionality" (Human Rights Watch 2016) or that "human control over AI and machine-learning applications employed as means and methods of warfare is required to ensure compliance with the law" (ICRC 2019a). These statements are unfalsifiable in a legal sense until the exact contents of the 'human control' requirement are delineated first; as they are, they "add little substance to the discussion until States can either come to agreement or develop law through practice" (Jensen 2020). As it stands, MHC remains undefined-and thus unworkable-in international law (Chengeta 2016).…”
Section: The Need for a Working Framework of MHC (mentioning)
confidence: 99%
“…As militaries (Defense Science Board 2012; Knight 2019; Ministère des Armées (France) 2019) announced interest in harnessing artificial intelligence (AI) to allow weapon systems to make increasingly complex and independent decisions on the battlefield, the initial reverberations were significant: authors (Sparrow 2007, 2016; Asaro 2012), NGOs (Campaign to Stop Killer Robots 2012; Fleming 2009; Future of Life Institute 2015) and international organisations (Beerli 2014; Heyns 2013; ICRC 2013) swiftly called for discussions or even outright bans of the technology, and an intergovernmental panel was established under the auspices of the Convention on Certain Conventional Weapons (CCW) to discuss the appropriate international response (Simon-Michel 2014). More than eight years on, the output of the CCW panel has been poor, its discussions seemingly marred by semantics, a lack of focus, and conflicting political positions (Jensen 2020; Schuller 2017).1 One positive aspect of the CCW discussions is that they placed the notion of Meaningful Human Control (MHC) onto the international agenda.…”
Section: Introduction (mentioning)
confidence: 99%