Militarization of new technologies: learning from the past to regulate Autonomous Weapons Systems

Academic year: 2021



Master’s thesis

Militarization of new technologies: learning from the past to regulate Autonomous Weapons Systems

Adequacy and relevance of legal reviewing, hard law and soft law towards Autonomous Weapon Systems regulation

Master track: Public International Law
Supervisor: Jeroen Van den Boogaard

Word count: 12741 words
Date of submission: 20/06/2020

Name: Marie Porchet
Student number: 12809934
E-mail: marieporchet@bluewin.ch


Abstract

The last decade has been characterized by significant technological advances resulting in increasingly automated weapons. Developing Autonomous Weapons Systems (AWS) can offer compelling military advantages (lower human casualties, greater precision in attacks). Considering the challenges posed by AWS to International Humanitarian Law principles (distinction, precaution and proportionality), numerous actors have started expressing concerns about the absence of a regulatory framework surrounding those weapons.

The advantages and disadvantages of legal reviewing, hard law and soft law are assessed in this report as propositions to regulate AWS. They are examined in light of an analysis of the regulatory frameworks of other types of weapons systems, focusing on their negotiation, adoption and enforcement processes. Their relevance in view of the legal regimes of state responsibility and individual criminal liability is further analyzed. States' compliance with a future framework, and responsibility gaps possibly arising from the use of such weapons, will significantly impact the degree of adequacy of these three propositions.

The value of reaching a consensus on definitional uncertainties before drafting a legal framework is underlined. International Humanitarian Law (IHL) principles may be impaired by AWS in certain situations, but may also benefit from their sophistication and reliability.

Elements from past attempts to control weapons relying on technological advances (cluster munitions, incendiary weapons, nuclear and cyber weapons, etc.) are analyzed, and key lessons to be considered when adopting one of the three types of regulation at stake are highlighted.

The need for a flexible regulatory framework, considering AWS' rapid technological evolution, and for broad acceptance of these norms by powerful states, crucial for the framework's efficiency in preventing IHL violations, is demonstrated. Strict legal reviews of AWS before their fielding, as well as full human operational control over their conduct, are of critical importance. Such elements could be encompassed in soft law instruments.

The rules of state responsibility, combined with the notion of distributed responsibility within military forces, are sufficient indicators that no liability gaps arise from the use of AWS. A strict ban of AWS through a binding instrument is not imperative. Soft law regulation therefore arises as the adequate alternative to efficiently prevent IHL violations.


Abbreviations

ARSIWA: ILC Draft articles on Responsibility of States for Internationally Wrongful Acts
AWS: Autonomous Weapons Systems
AI: Artificial Intelligence
CCW: Convention on Certain Conventional Weapons
GGE: Group of Governmental Experts on Lethal Autonomous Weapons Systems
HRW: Human Rights Watch
HR: Human Rights
ICJ: International Court of Justice
ICL: International Criminal Law
ICTR: International Criminal Tribunal for Rwanda
ICTY: International Criminal Tribunal for the former Yugoslavia
ICRC: International Committee of the Red Cross
IHL: International Humanitarian Law
IHRL: International Human Rights Law
NWS: Nuclear Weapons States


Table of contents

1. INTRODUCTION
1.1. SUB-QUESTIONS AND METHODOLOGY
2. AUTONOMOUS WEAPON SYSTEMS: DEFINITIONS AND EXISTING LEGAL FRAMEWORK
2.1. TOWARDS A SHARED DEFINITION OF AWS: TECHNICAL AMBIGUITY
2.2. TOWARDS A SHARED DEFINITION OF AWS: DEFINITIONAL GAPS
2.3. CHALLENGES TO EXISTING LEGAL REGIMES
2.3.1. IUS AD BELLUM
2.3.2. HUMAN RIGHTS AND LAW ENFORCEMENT
2.3.3. INTERNATIONAL HUMANITARIAN LAW
2.3.3.1. Distinction
2.3.3.2. Precaution
2.3.3.3. Proportionality
2.4. CONCLUDING REMARKS
3. COMPARATIVE STUDY: WHAT APPROPRIATE LEGAL FRAMEWORK FOR AWS REGULATION?
3.1. LEGAL REVIEWING
3.2. REGULATION THROUGH HARD LAW: STRICT BAN OF AWS
3.2.1. INTRODUCTION OF NOTIONS OF HUMAN CONTROL
3.2.2. ADEQUACY OF HARD LAW
3.3. ROLE OF SOFT LAW IN REGULATING AWS
3.3.1. ADEQUACY OF SOFT LAW
3.4. LEARNING FROM THE PAST: IHL AND DEVELOPING TECHNOLOGY
3.4.1. BLINDING LASERS AND INCENDIARY WEAPONS
3.4.2. CLUSTER MUNITIONS
3.4.3. ANTI-PERSONNEL LANDMINES
3.4.4. REMOTELY-CONTROLLED SYSTEMS
3.4.5. CYBER WEAPONS
3.4.6. NUCLEAR WEAPONS
3.5. CONCLUDING REMARKS
4. LIABILITY FOR VIOLATIONS OF IHL THROUGH THE USE OF AWS: RESPONSIBILITY GAPS AND REMEDIES
4.1. STATE RESPONSIBILITY
4.1.2. ENFORCEMENT DIFFICULTIES IN PRACTICE
4.2. INDIVIDUAL CRIMINAL RESPONSIBILITY
4.2.1. INDIVIDUAL RESPONSIBILITY OF AWS AS MORAL AGENTS?
4.2.2. MILITARY COMMANDERS AND CIVILIAN SUPERIORS
4.2.2.1. Resolving liability gaps in command responsibility
4.2.2.2. Disabling mechanisms and human control
4.2.3. DEVELOPERS AND MANUFACTURERS
4.2.3.1. Aiding and abetting: responsibilization of non-state actors
4.2.3.2. Resolving liability gaps as a support towards soft law
4.3. CONCLUDING REMARKS AND RECOMMENDATIONS
5. CONCLUSION


1. Introduction

“A year spent in artificial intelligence is enough to make one believe in God.” - Alan Perlis

Historically, societies have always tried to develop instruments to optimize human performance, whether for agricultural, industrial or military purposes. Our modern history is filled with examples of spectacular advances in science or technology that deeply impacted human societies. Over time, the accumulation of technical advances in robotics and automation, whether minor or groundbreaking, generated a virtuous circle of innovation. The intensification of research in Artificial Intelligence (AI) and machine learning started to attract military forces’ interest.

The development of Autonomous Weapons Systems (AWS), defined as mechanisms capable of deciding, controlling and initiating an attack without human contribution, results from the weaponization of these technologies. Massive increases in investments promoting this pioneering sector have led a growing number of scientists, scholars and governmental actors to express their concerns about the emergence of such weapons.

Over the last century, States involved in armed conflicts have consistently relied on technology to “destroy more efficiently and abundantly, but also to defend people from harm”.1 The fielding of AWS by States could effectively reach such objectives, by ensuring precise and fast-paced military decisions and by reducing the number of humans physically needed for military operations, AWS being used as substitutes for individuals in “dull, dirty and dangerous tasks”.2

When discussing the development of AWS, the debate generally focuses on two questions. First, there is moral uncertainty as to whether machines should be enabled, with limited or no human intervention, to target military objectives. Secondly, and more specifically in connection with International Humanitarian Law (IHL), it is debated whether AWS could respect the general rules of warfare, including the principles of proportionality, distinction and precaution. This report will slightly depart from these considerations to focus on a third

1 Press (2017). Of robots and rules: Autonomous weapon systems in the law of armed conflict, p. 1338

2 Weizmann (2014). Autonomous Weapon Systems under International Law, p. 4. See Boothby (2014). Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors, pp. 104-107. See also Sassoli (2014). Autonomous Weapons and International Humanitarian Law: Advantages, Open Technical Questions and Legal Issues to be Clarified, p. 310.


question: what type of legal framework could appropriately regulate AWS, in order to prevent violations of IHL through their use?

1.1. Sub-questions and methodology

To answer this research question, three main sub-questions will be addressed.

First, we will examine the challenges posed by the emergence of AWS to several international legal regimes, particularly regarding IHL norms, but also in connection with ius ad bellum, International Human Rights Law (IHRL) and law enforcement.

Continuing with a comparative analysis of the legal frameworks governing existing weapons, our premise is that by analyzing how the international legal regimes of weapons law and targeting law have evolved in response to the weaponization of technological advances in the past, we can determine what structure a future legal framework on AWS should take.

This report will finally determine whether responsibility gaps can arise from the use of AWS. Both international liability and criminal responsibility regimes could indeed affect arguments for or against the above-mentioned regulation alternatives. In terms of methodology, this research follows an evaluative approach, using comparative law to make prescriptive recommendations, and addresses multiple legal regimes.


2. Autonomous weapon systems: definitions and existing legal framework

Claims in favor of inter-State talks to rule on the use of AWS have intensified, but have been impeded by the failure of the international community to agree on a shared definition of such systems and on the precise limits of existing technology.

2.1. Towards a shared definition of AWS: technical ambiguity

Actors involved in attempts to define AWS (States, non-governmental organizations, scientific associations) differ in their background and expertise, hence their “views on whether these technologies already exist diverge”.3 Starting with States, which have different technological capacities, the debate extends to international organizations. While Human Rights Watch (HRW) affirms that such weapons are not yet conceivable, the International Committee of the Red Cross (ICRC) “points out that examples of existing weapons with autonomy in their critical functions already exist”.4

Nevertheless, negotiating processes have already been carried out pre-emptively, before the technology could be used by States as weapons.5 A certain degree of autonomy has existed in weapon systems for decades, for instance in anti-personnel landmines or torpedoes.6 It is also clear that certain “modern and very sophisticated autonomous (or at least very highly automated) weapon systems already exist”.7 For instance, the Samsung Techwin SGR-1 is capable of targeting and firing without human intervention.8 On another autonomy level, both the US MK 15 Phalanx Close-In Weapons System, “capable of autonomously performing its own…engage and kill assessment functions”, and the Iron Dome, a semi-autonomous Israeli defense system, should be mentioned.9 Additionally, it is manifest from a

3 Ekelhof (2017). Complications of a Common Language, p. 320.

4 Idem. See International Committee of the Red Cross (2016). Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons. See also Human Rights Watch (2014). Advancing the Debate on Killer Robots: 12 Arguments for a Preemptive Ban on Fully Autonomous Weapons, p. 1

5 See Protocol IV on Blinding Weapons to the Convention on prohibitions or restrictions on the use of certain conventional weapons (hereinafter Protocol IV CCW). See Human Rights Watch (2015). Precedent for Preemption: The Ban on Blinding Lasers as a Model for a Killer Robots Prohibition

6 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 388.
7 Idem.

8 See Thomas (2015). Autonomous weapon systems: the anatomy of autonomy and the legality of lethality, p. 245

9 See respectively United States Navy (n.d). MK15 - Phalanx Close-In Weapon System, and Raytheon Missiles


technical perspective that “the tipping point from a highly-automated system to an “autonomous” one is very thin”.10

2.2. Towards a shared definition of AWS: definitional gaps

AWS are categorized as weapons systems, which unlike weapons include “the weapon itself and those components required for its operation”.11 Although commonly known as devices whose level of autonomy is sufficient to enable them to select or attack targets independently of any human intervention, even such a general definition is subject to various interpretations. The terms autonomy, target selection and attack, as well as human intervention, are “fluid and pluralistic” and should be strictly delimited by States.12 This chapter will mainly focus on the term ‘autonomy’, which can be defined as self-governance, here linked to the critical functions of AWS.

Different views are voiced by scholars as to what type of system should be defined as autonomous. Some authors think that a system containing ‘automated’ elements, if pre-programmed to select and attack targets without “agents to monitor its behaviour”, already qualifies as autonomous, even if not including ‘intelligence’ inputs.13 Others state that genuine autonomy is only achieved if the system can ‘learn’ with skills provided by AI,14 thus if it possesses “a set of intelligence-based capabilities” providing it with the ability to “respond to situations that were not programmed”15 and to “revise its initial stock of knowledge, so as to face the challenges of its environment”.16

AWS are categorized according to their degree of autonomy and can operate with humans on or out of the loop (also referred to as semi- and fully autonomous systems). This terminological distinction is useful to distinguish between two levels of machine autonomy. If a weapon can be overridden by agents while performing its critical functions, humans are considered on the loop; when no human interaction is conceivable, they are out of the loop.17 The upcoming

10 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 389.
11 Lawand (2006). A Guide to the Legal Review of New Weapons, Means and Methods of Warfare: Measures to Implement Article 36 of Additional Protocol I of 1977, p. 937

12 See Ekelhof (2017). Complications of a Common Language, p. 322.

13 Sartor & Omicini (2016). The autonomy of technological systems and responsibilities for their use, p. 39.
14 Ekelhof (2017). Complications of a Common Language, p. 324

15 United States Department of the Air Force (2015). Autonomous Horizons – System Autonomy in the Air Force: A Path to the Future

16 Sartor & Omicini (2016). The autonomy of technological systems and responsibilities for their use, p. 40.
17 Idem.


analysis will focus on various degrees of automation and independence, to capture the broad spectrum of challenges and questions posed by AWS.

2.3. Challenges to existing legal regimes

Developing fully or semi-autonomous weapons can present serious advantages for actors engaged in warfare. Through the depersonalization of weapons, States may benefit from a decrease in human losses and in physical or psychological harm caused to fighters.18 By “shortening the time between the positive identification of a target and its attack”, the use of AWS may be valuable in situations where “the faster system wins the engagement” or where manned weapons may not be sufficiently prompt to react to quick situation changes due to “highly movable” targets.19 AWS may be of particular value in “high-intensity conflicts”, by increasing reliability in data and information treatment.20 However, the emergence of AWS also results in challenges for multiple legal regimes.

2.3.1. Ius ad bellum

This legal regime sets the conditions States must fulfill in order to lawfully use force, and concerns have been raised that the emergence of AWS could “prompt states to lower the threshold for using force, and consequently increase the incidence of attacks”.21 If a decision to attack were triggered by machines, it could, for instance, lack the prior human reasoning or political considerations which would usually discourage the resort to force.

2.3.2. Human Rights and law enforcement

From a domestic law enforcement perspective, AWS appear particularly advantageous, for instance to control riots or large crowds, to secure borders, or to fight terrorist threats, but they could also have dramatic consequences in terms of Human Rights (HR) violations.22

18 International Committee of the Red Cross (2014). Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects, pp. 9-10

19 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 390.
20 International Committee of the Red Cross (2014). Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects, p. 10. See also Weizmann (2014). Autonomous Weapon Systems under International Law, p. 4

21 Weizmann (2014). Autonomous Weapon Systems under International Law, p. 9. See United Nations Human Rights Council (2013). Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, p. 11, §58

22 See Heyns (2016). Human rights and the use of autonomous weapons systems (AWS) during domestic law enforcement


Principle 9 of the 1990 Basic Principles on the Use of Force and Firearms affirms that “lethal use of firearms may only be made when strictly unavoidable in order to protect life”.23

Moreover, law enforcement norms prescribe that any use of force be limited to specific situations like self-defense or the prevention of crime, and be preceded by non-lethal force.24 AWS must therefore be able to precisely and constantly evaluate the context to determine if lives are at risk, and subsequently “select and implement alternatives to lethal force, such as negotiation and capture” if more appropriate under the principle of necessity.25 Considering the technical difficulties in designing machines with such capacities, it would be particularly challenging to prevent situations where AWS could trigger HR violations, among which breaches of the right to life or to security and liberty.26

2.3.3. International Humanitarian Law

The debate concerning the compliance of existing and future AWS with rules of IHL is profoundly intricate and controversial. Even if this legal regime does not expressly regulate such weapon systems, their use is undeniably subject to all customary rules of IHL.27 Two dimensions of the law of armed conflict must be distinguished: weapons law and targeting law.

Starting with weapons law, which focuses on the legality of the weapon system per se, AWS’ potential indiscriminate nature or capacity to cause unnecessary suffering must be assessed.28 This verification must rely solely on the “nature of the weapon in the uses for which it was designed”, not on any possible misuse that could be conceived.29 The indiscriminate nature of weapons such as anti-personnel landmines or incendiary weapons has been ascertained in the past, based on their incapacity to clearly distinguish between

23 1990 Basic Principles of the Use of Force, principle 9

24 Weizmann (2014). Autonomous Weapon Systems under International Law, pp. 11-12. See Office of the High Commissioner for Human Rights (n.d.). Commentary of the Code of Conduct for Law Enforcement Officials 1979, commentary of art. 1. See also principle 9 of the Basic Principles on the Use of Force and Firearms by Law Enforcement Officials (hereinafter 1990 Basic Principles of the Use of Force)

25 Ibid., p. 12

26 International Covenant on Civil and Political Rights (hereinafter ICCPR), art. 6.1 and 9.1. See Heyns (2016). Human rights and the use of autonomous weapons systems (AWS) during domestic law enforcement, pp. 362-364

27 Davison (2018). A legal perspective: Autonomous weapon systems under international humanitarian law, p. 7.
28 International Committee of the Red Cross (n.d.). International Humanitarian Law database: Customary IHL, rules 70 and 71. See Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (hereinafter AP I), art. 51(4)(b) and art. 35(2).


lawful targets and non-combatants.30 It was initially argued that since AWS share characteristics of autonomy with anti-personnel landmines, which are pre-programmed to “detonate based on physical contact”, they should also be considered illegal per se.31

However, a distinction between ‘automated weapons’ such as landmines (reacting to pre-programmed criteria) and autonomous weapons (using Artificial Intelligence tools) was later made.32 Since AWS rely on “sufficiently reliable and accurate data to ensure it can be aimed at a military objective”, they cannot be deemed ‘indiscriminate’.33 In a similar vein, AWS do not particularly cause unnecessary suffering, unless they were “armed with weapons and ammunitions” occasioning such harm on victims, for instance if reinforced with poisonous or chemical weapons, or if capable of launching incendiary weapons.34

The question of AWS’ compliance with targeting law will now be considered, in light of the core principles required for a lawful conduct of hostilities, namely distinction, precaution and proportionality.35

2.3.3.1 Distinction

AWS must efficiently distinguish between combatants and protected persons or objects. In light of the considerable progress of AI, facial recognition, radar systems and thermal signature, AWS using such technology may not raise issues of distinction, and may even be more reliable than humans in identifying “which persons belong to which side of the equation”, hence contributing to a reduction in unlawful targeting.36

Contextual and geographical factors are decisive in determining whether AWS could compromise the principle of distinction: if operating “in remote regions, such as underwater, deserts, or areas like the Demilitarized Zone in Korea”, their sensors would presumably correctly identify lawful targets.37 However, considering features of modern warfare, for instance when “insurgents fight among the civilian population” or in highly populated areas, distinguishing lawful targets from protected persons is considered

30 See International Committee of the Red Cross (n.d.). International Humanitarian Law database: Customary IHL, rule 71

31 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 388.
32 See Van Den Boogaard (2015). Proportionality and Autonomous Weapons Systems, p. 253

33 Thurnher (2013). The Law that Applies to Autonomous Weapon Systems

34 See International Committee of the Red Cross (n.d.). International Humanitarian Law database: Customary IHL, rule 70

35 Ibid., rules 1, 14 and 15

36 Van Den Boogaard (2015). Proportionality and Autonomous Weapons Systems, pp. 261-262.
37 Thurnher (2013). The Law that Applies to Autonomous Weapon Systems


challenging even for human agents.38 Similarly, if confronted with a lawful target who decided to surrender or became hors de combat, AWS must be able to detect the change in their status.39 The lack of human interpretation of an opponent’s emotions and gestures could entail IHL violations, especially in situations where a subtle sign can radically change an individual’s fate.

2.3.3.2 Precaution

The principle of precaution contains an obligation to verify that “the objectives to be attacked are lawful military objectives”.40 Such verification must generally be done by doing ‘everything feasible’ and using all available information. It also includes an obligation for armed forces to “take all feasible precautionary measures in order to avoid, and in any event minimize, civilian casualties” when choosing their means and methods of warfare.41 The feasibility of the measures is here understood as what is “practicable or practically possible taking into account all circumstances ruling at the time”.42 Although initially “addressed to…the human being” operating weapon systems, this duty of necessary precaution becomes an ‘obligation’ for the AWS itself, since it can independently select targets.43 This duty “inherently [implies] a value judgment about whether all feasible steps have been taken”, as well as precise evaluations of contextual or strategic changes.44 AWS thus cannot rely only on pre-programmed plans and need to adapt promptly to any modification in external circumstances that would imply a change in means and methods of warfare.

2.3.3.3 Proportionality

The proportionality requirement reflects the prohibition on pursuing attacks if they can be expected to cause collateral damage to civilians or civilian objects which “would be excessive in relation to the concrete and direct military advantage anticipated”.45 Such an evaluation relies on whether “a reasonably well-informed person…could have expected excessive civilian

38 Van Den Boogaard (2015). Proportionality and Autonomous Weapons Systems, p. 263

39 Boothby (2014). Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors, p. 149

40 Art. 57(2) AP I

41 Van Den Boogaard (2015). Proportionality and Autonomous Weapons Systems, p. 259. See also art. 57 AP I.
42 Amended Protocol II on Prohibitions or Restrictions on the Use of Mines, Booby-Traps and Other Devices to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons (hereinafter Amended Protocol II CCW), art. 3(10)

43 Van Den Boogaard (2015). Proportionality and Autonomous Weapons Systems, p. 279.
44 Thurnher (2013). The Law that Applies to Autonomous Weapon Systems

45 International Committee of the Red Cross (n.d.). International Humanitarian Law database: Customary IHL,


casualties to result from the attack”,46 in light of a potential concrete and “sufficiently tangible” expected military advantage.47

Conceptualizing the assessment of such a military advantage within a machine is extremely complex, since it represents “a process riddled with inevitably subjective value judgements”.48 The uncertainty surrounding the legal definitions of most factors influencing the evaluation of expected military advantage49 makes it difficult to render such an assessment objective. Indeed, humans or machines making this assessment must take into account rapid changes in circumstances, military strategies, adversary forces and the adversary’s plans.50 To be capable of making proportionality assessments, AWS must be continuously kept up to date with new military objectives and strategies51 in order to “handle the infinite number of scenarios it might face”.52

2.4. Concluding remarks

- Highly autonomous systems with limited need for human intervention already exist. They will certainly be perfected in the future and reach increasingly high degrees of autonomy.

- The multiplicity of actors involved in the process of discussing AWS legality results in definitional and technical ambiguities. Consensus on terms like ‘autonomy’, ‘target selection/attack’ and ‘human intervention’ must be achieved.

- Degrees of autonomy must be acknowledged, from machines with humans remaining on the loop to systems with ‘machine learning’ capacities.

- AWS’ development may facilitate States’ decisions to resort to force. It can also affect Human Rights if used domestically without non-lethal force or contextual adaptation capabilities.

- AWS challenge IHL principles, although they are neither indiscriminate by nature nor a cause of unnecessary suffering:

46 ICTY, Prosecutor v. Stanislav Galič, Judgment, §58

47 Van Den Boogaard (2015). Proportionality and Autonomous Weapons Systems, p. 265.
48 Sassoli (2014). Autonomous Weapons and International Humanitarian Law, p. 331.
49 See Van Den Boogaard (2015). Proportionality and Autonomous Weapons Systems, p. 261.
50 See Sassoli (2014). Autonomous Weapons and International Humanitarian Law, p. 331.
51 Ibid., p. 332


o The principle of distinction could be respected with the use of sophisticated behaviour recognition devices; a limitation could be the interpretation of human emotions by machines.

o The precaution principle would require AWS to adapt rapidly to changes in the attack context.

o The proportionality principle will similarly require AWS to gather sufficiently reliable information to make adequate assessments.


3. Comparative study: what appropriate legal framework for AWS regulation?

The international community seems divided regarding the form a future legal framework regulating AWS should take. Different regulation propositions will be considered here, prior to a comparative analysis of other weapon systems’ legal frameworks.

3.1. Legal reviewing

Of particular importance for AWS’ compliance with IHL is the obligation for States under art. 36 of the 1977 Additional Protocol I to the Geneva Conventions to review the legality of means and methods of warfare before considering their acquisition, development or use in the field.53 This obligation begins as soon as States engage in the process of incorporating technology into a weapon system and continues while the weapon is being tested and fielded.54 However, the legality of AWS cannot merely rely on the results of legal reviews, since most issues raised by their development clearly “go far beyond the scope of Article 36” and remain unpredictable.55

Moreover, States retain a certain latitude in carrying out such legal reviews and tend to keep the results of their research confidential,56 especially when their findings could generate “an imbalance of military strength vis-à-vis the enemy precisely by means of superior technology in the form of new weapons”.57 Such lack of transparency is exacerbated by the absence of global cooperation initiating discussions or exchanges of information between scientific, legal and military protagonists.

Even though legal reviewing is required from all States and stands as a “vitally important method whereby states should ensure their continued compliance with weapons law rules”,58 art. 36 is insufficient to guarantee the absence of violations. Consequently, this approach will

53 Art. 36 AP I

54 See Boothby (2014). Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors, p. 18

55 Herby (2016). The humanitarian perspective: from field to policy to law, p. 46.
56 Boothby (2016). The regulations of weapons under IHL, p. 39

57 International Committee of the Red Cross (1987). Commentary of the art. 36 1977 Additional Protocol I to the Geneva Conventions of 1949, §1470


remain “complementary and mutually reinforcing of multilateral discussions on autonomous weapon systems”.59 This report will thus focus on two other options: hard law and soft law.

3.2. Regulation through hard law: strict ban of AWS

3.2.1. Introduction of notions of human control

HRW is opposed to the idea that AWS can benefit from their lack of “human emotions” (such as fear, stress or desire for revenge) and argues that the use of these weapons should be restrained by making “explicit the requirements for compliance” or by stigmatizing breaching States.60 This opinion is upheld by the organization ‘Article 36’, which initiated the notion of ‘meaningful human control’ as a proposition to be included in a future regulation of AWS. The idea is to maintain human control over any use of force by machines, as “many of the concerns raised by fully autonomous weapons are attributable to the absence of such control”, notably the risk of impairing human dignity.61 The ICRC has also encouraged the introduction of binding rules guaranteeing human control over such weapons and emphasizes the importance of retaining reasonable control over AWS’ critical functions.62

Various terminologies are used by the different actors who claim that the notion of human control should be key to an AWS framework, e.g. ‘meaningful human control’, ‘intelligent partnership’ or ‘appropriate levels of human judgment’.63 Such terminological diversity reflects the divergence in the contributors’ interests, since their “choice for one formulation over another seems to be partially based upon existing policy”.64 Diverging views regarding the core elements of a possible ban could lead to a deadlock, since a binding

59 Giacca (2016). Legal reviews of new weapons: process and procedure, p. 76

60 See Human Rights Watch (2014). Advancing the Debate on Killer Robots: 12 Arguments for a Preemptive Ban on Fully Autonomous Weapons, pp. 10-12 + p. 3. See also p. 22

61 See ibid., pp. 2-3. See Article 36 (2013). Killer Robots: UK Government Policy on Fully Autonomous Weapons

62 Davison (2018). A legal perspective: Autonomous weapon systems under international humanitarian law, p. 10. See also International Committee of the Red Cross (2014). Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects, p. 8. See also International Committee of the Red Cross (2016). Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons, p. 83.

63 See Ekelhof (2017). Complications of a Common Language: Why it is so Hard to Talk about Autonomous Weapons, p. 328. See also Campaign to Stop Killer Robots (n.d.). The threat of fully autonomous weapons.


legal instrument must be adopted by consensus if negotiated under the framework of the Convention on Certain Conventional Weapons (CCW).65

The CCW, primarily conceived to prevent the unnecessary suffering and indiscriminate effects that conventional weapons could induce, appears to be an interesting platform for future regulation of AWS. This idea was confirmed by the work of a Group of Governmental Experts (GGE) on “emerging technologies in the area of lethal autonomous weapons systems in the context of the objectives and purposes of CCW”, which attracted broad participation from States and organizations.66 Even if divergences in participants’ views regarding the details of a potential requirement of human control were observed, its importance was restated in the GGE 2019 final report, in which all representatives agreed that “human judgement” was essential to maintain a “responsible chain of command” for any use of AWS.67

3.2.2. Adequacy of hard law

Before exposing arguments in favor of non-binding norms, it is worth mentioning situations in which hard law would have more relevance than softer instruments.

First, Williamson finds that hard law will be more suitable to ensure compliance when the “political risk of being labeled a lawbreaker is higher than the political risk of engaging in legal but disfavored conduct”.68 Applied to the AWS context, this finding seems to support the idea of a binding instrument. Indeed, the ethical challenges posed by these weapons, e.g. the fear of losing control over their lethal application, and the uncertainty surrounding future technological advances in this field, which will most certainly exceed current expectations, have progressively provoked public outcry. However, States will unlikely refrain from using fully autonomous weapons if such technology is available to them and is not formally prohibited under IL, regardless of the social condemnation that may arise from its use.

65 Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May be Deemed to be Excessively Injurious or to Have Indiscriminate Effects. See Ekelhof (2017). Complications of a Common Language, p. 329. See also Campaign to Stop Killer Robots (2013). Convention on Conventional Weapons and Fully Autonomous Weapons: Background paper, p. 2

66 Gill (2018). The Role of the United Nations in Addressing Emerging Technologies in the Area of Lethal Autonomous Weapon Systems

67 GGE (2019). Reports of the 2019 session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, p. 4

68 Williamson (2003). Hard law, soft law, and non-law in multilateral arms control: some compliance hypotheses


Second, hard law appears more appropriate in situations where a treaty could enable States to “impose duties on nonstate actors”.69 States should take into account the threat posed by nonstate actors if such actors were to equip themselves with lethal AWS. The military advantage gained by armed groups would change the dynamic of non-international armed conflicts, particularly if AWS were to be acquired by terrorist groups. Technology companies, particularly those developing software, hardware or deep learning technology, could also be impacted by such regulation, considering that they could be held responsible for their participation in pre-programming and manufacturing AWS, if the latter were to commit crimes (see chapter IV of the present report).

Third, hard law is preferred where “effective compliance requires intrusive verification…that can be achieved only through a treaty instrument”.70 Verification through soft law is limited in the arms control regime. For an international body to conduct such inspections on AWS, hard law would be advisable considering the ‘dual use’ capacity of these weapons, which can “be used for either peaceful or nonpeaceful purposes”, hence requiring extensive evidence to ascertain their “nonpeaceful intent”.71

3.3. Role of soft law in regulating AWS

Considering that AWS will have revolutionary impacts on military advantages in warfare, and given the degree of uncertainty associated with the development of AI or robotics, regulating such systems with hard law instruments could be counterproductive.

Soft law is a faster and simpler alternative, non-binding but “nevertheless declaratory of aspirational norms of international behavior”, of particular relevance when applied to the domain of weapon systems relying on new technologies.72 Soft law instruments such as “informal guidance…best-practice guidance, codes of conduct” have recently expanded in relation to arms regulation.73

69 Idem.

70 Ibid., pp. 71-73
71 Ibid., pp. 71-72
72 Ibid., p. 63

73 Eggers & al. (2018). The future of regulation: Principles for regulating emerging technologies, p. 11


3.3.1. Adequacy of soft law

The sense of hard law superiority conveyed in the previous sub-chapter can be nuanced. Firstly, soft law mechanisms will be favored if they incorporate and interpret pre-existing rules.74 An AWS code of conduct would develop new legal directives, for instance by requiring weapons to include “human override capability” or “built-in self-neutralization mechanism”.75 But it would also rely on pre-existing norms, mostly core principles of IHL, which have an undeniable customary nature and are thus binding on all parties to a conflict.

Secondly, a similar effect can be observed when the legal instrument is “linked with international endeavors having a very high degree of popular support”.76 Society’s growing concern regarding AI progress could indicate that compliance could be ensured by soft law.

Thirdly, soft law also has an appreciable impact when “only a small group of states is needed to achieve substantial compliance”.77 Since only a limited number of States will have the financial capacity to acquire AWS, soft regulations could achieve prompt success if accepted by this category of States. However, this argument is less compelling, as it would undermine the influence of smaller nations during the negotiations, which could be problematic for their compliance in the more distant future, when AWS could become more affordable.

Additionally, soft law offers more flexibility and allows regulators to “more precisely tailor restrictions to the evolving state of fully autonomous weapons technology”.78 The transition from a “highly-automated system” to a “fully autonomous system” is relatively fluid, so a strict ban could also impair the conception of similar systems with “numerous potential advantages”.79 If a strict ban on AWS was adopted, certain systems easily transformable into fully autonomous weapons, although not prohibited under the ban, could be considered as threats. The prohibition of systems merely because of their technical proximity with weapons forbidden by hard law could be avoided with soft law instruments.

74 Williamson (2003). Hard law, soft law, and non-law in multilateral arms control: some compliance hypotheses, pp. 76-78

75 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 407
76 Williamson (2003). Hard law, soft law, and non-law in multilateral arms control: some compliance hypotheses, pp. 76-78

77 Idem.

78 See Human Rights Watch (2014). Advancing the Debate on Killer Robots: 12 Arguments for a Preemptive Ban on Fully Autonomous Weapons, p. 22


AI scientists warn that AWS will soon be deeply transformed by the development of machines’ ‘deep learning’ capacities, referring to mechanisms “whereby the system ‘learns’ how to learn”, for instance by formulating new questions it will then process alone.80 Machine learning will change the way “systems learn tasks and improve their performance through experience”, thus leading to considerable improvements in “pattern recognition” (including target recognition) but also to an increase in those systems’ degree of unpredictability.81 These considerations confirm that flexibility will be a key notion when drafting a legal instrument.82

3.4. Learning from the past: IHL and developing technology

This section will further discuss and compare the strengths and constraints of both hard and soft law options, through the historical and legal study of other technological advances that, when applied to arms or weapon systems, radically changed the functioning of warfare.

3.4.1 Blinding lasers and incendiary weapons

In 1995, the use of laser weapons was restricted by Protocol IV to the CCW, which prohibited “laser weapons specifically designed, as their sole combat function or as one of their combat functions, to cause permanent blindness to unenhanced vision”.83 States parties to the CCW adopted this ban on blinding laser weapons by consensus, due to the unnecessary suffering caused by these lasers to combatants, but mainly because of the “long-term impact on society of increased numbers of blind veterans”.84 The legal loopholes contained in this Protocol have been used by certain States to circumvent its provisions, notably because of the uncertainty of its language, which “may be interpreted to undermine the principles upon which it is based and the spirit behind its prohibitions”.85 Indeed, the use of certain laser weapons which are not strictly banned under art. 1, but which may cause “incidental or collateral effect of the legitimate military employment of laser systems” according to art. 3 Protocol IV, would remain lawful under IL.86

80 Boulanin & Verbruggen (2017). Mapping the Development of Autonomy in Weapon Systems, p. 17
81 Ibid., p. 114

82 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 396
83 Protocol IV on Blinding Weapons to the Convention on prohibitions or restrictions on the use of certain conventional weapons (hereinafter Protocol IV CCW), art. 1

84 Carnahan & Robertson (1996). The Protocol on ‘Blinding Laser Weapons’: a new direction for international humanitarian law, p. 483

85 Peters (1996). Blinding laser weapons: new limits on the technology of warfare, p. 735
86 Art. 3 Protocol IV CCW


Such dual use of a weapon is also a characteristic of AWS. Therefore, even though Protocol IV on blinding laser weapons is extensively invoked as the “leading example of a ban on a technologically advanced weapon that had not yet been deployed”, its whole purpose can be impaired by States.87 It can be argued that soft law, by allowing more flexibility, could have prevented such a misuse of laser devices. Although it should be kept in mind that soft law generally cannot “fill the gaps that exist in hard law”, a negotiation process aimed at the adoption of additional soft law norms could reinforce the incomplete nature of Protocol IV by providing more adaptability to the concrete challenges posed by laser devices.88

A policy of misusing weapons to elude binding restrictions can also be observed in the case of incendiary weapons, more specifically with white phosphorus munitions. By using a “chemical substance that ignites when exposed to atmospheric oxygen”,89 such munitions can provide valuable military advantages but also cause “immediate burns…multi-organ failure and central nervous system injury” to human beings exposed to white phosphorus.90 Despite the dramatic consequences of these munitions on public health, they are not covered by Protocol III to the CCW on Prohibitions or Restrictions on the Use of Incendiary Weapons, since they are not “primarily designed to set fire to objects or to cause burn injury to persons”.91

Consideration should be given to these findings during the process of drafting regulation for AWS, since it illustrates the challenges posed by hard law enforcement.

3.4.2 Cluster munitions

There are many lessons to be learned from the 2008 Convention on Cluster Munitions. This treaty introduced a general prohibition for States parties to “develop, produce, otherwise acquire…or transfer to anyone, directly or indirectly, cluster munitions”.92 Cluster munitions are defined as devices “designed to disperse or release explosive submunitions”.93 Art. 2(2) of the Convention then enumerates what types of

87 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 399
88 Boothby (2016). The regulations of weapons under IHL, p. 40

89 Human Rights Watch (2014). An Overdue Review: Addressing Incendiary Weapons in the Contemporary Context, p. 4

90 Centers for Disease Control and Prevention (n.d.). White phosphorous: Systemic Agent

91 Protocol III on Prohibitions or Restrictions on the Use of Incendiary Weapons to the Convention on prohibitions or restrictions on the use of certain conventional weapons (hereinafter Protocol III CCW), art. 1(1)

92 Convention on Cluster Munitions, art. 1(1)(b)
93 Ibid., art. 1(1) + 2(2)


munitions and submunitions shall remain out of the scope of the prohibition. This results from an innovative approach adopted by negotiating States, often called the ‘Oslo process’, whereby negotiations “started with a wide-ranging ban and then put it to those arguing for exclusions to justify why anything within that definition did not cause ‘unacceptable harm’”, in order to avoid time-consuming discussions on what must be banned.94 However, this policy led to an overloaded definition, with “no less than 20 sets of square brackets indicating competing views”, even though the technical and humanitarian impacts of cluster munitions were abundantly documented.95 The uncertainty surrounding future progress regarding AWS is such that an ‘Oslo process’ type of negotiation for a ban would likely fail.96

Another valuable lesson that can be drawn from the Convention on Cluster Munitions is its failure to rally the most financially and militarily powerful States, such as the United States, China, India or Russia, which rejected the text of the 2008 Convention while indicating that “they would be willing to negotiate a set of regulations under the Convention on Certain Conventional Weapons”.97 Since the reluctance of powerful States to join a legal framework on AWS could be dramatic in terms of international compliance, turning to soft law may “result in higher initial buy-in” by building “on an existing set of regulations and norms rather than to create one from scratch”.98

Finally, even if the dramatic effects of these munitions on civilians were eventually acknowledged,99 they were at first largely denied, some States relying on the fact that cluster munitions “could be lawfully launched on a military target alone in an otherwise unpopulated desert” to justify their rejection of the 2008 Convention.100 Linking this experience to the case of AWS is delicate. On one hand, some argue that geographical or temporal restrictions of AWS use to situations where the core principles of IHL would be easily respected “should not be used to legitimize weapons” or “stand in the way of an international prohibition”.101 This approach risks leading to a higher tolerance threshold vis-à-vis civilian casualties, as

94 Gow & al. (2019). Routledge handbook of war, law and technology, p. 55

95 Ekelhof (2017). Complications of a Common Language, p. 318. See Borrie (2009). Unacceptable Harm: A History of How the Treaty to Ban Cluster Munitions Was Won, pp. 201-202

96 Ibid., p. 319

97 Lewis (2015). The Case for Regulating Fully Autonomous Weapons, p. 1318. See Convention on Cluster Munitions (n.d.). States Parties and Signatories by region

98 Ibid., pp. 1318-1319

99 See Geneva Academy of International Humanitarian Law and Human Rights (n.d.). Cluster Munitions
100 See Human Rights Watch (2014). Advancing the Debate on Killer Robots: 12 Arguments for a Preemptive Ban on Fully Autonomous Weapons, p. 9


substantiated by the example of cluster munitions, which had dramatic consequences on civilians for decades due to their possible ‘lawful use’ in certain situations. On the other hand, a certain distinction in technical capabilities between cluster munitions and AWS must be acknowledged. The latter would not pose problems in terms of remnants of war (which was one of the main problems linked to cluster munitions) and are undeniably more advanced than cluster munitions, for instance in their recognition and distinction capacity.102

3.4.3 Anti-personnel landmines

Supporters of the ‘hard law’ perspective seem to have neglected an important element: when doubts arise as to a weapon’s capacity to “produce a desired tactical effect analogous to that of a human actor”, a ban of that weapon can only realistically be achieved when the military and “tactical benefit produced…is widely perceived by military leaders as insufficient to justify the risk of injury to civilians”.103 This outcome is underlined by the process of conventional regulation of anti-personnel landmines. Defined as “mines designed to be exploded by the presence, proximity or contact of a person…that will incapacitate, injure or kill one or more persons” by blast or fragmentation mechanisms, anti-personnel landmines are prohibited under the 1997 Ottawa Convention.104 Their use is similarly restricted by Protocol II to the CCW on Mines, Booby-Traps and Other Devices, as well as its amended version.105 Based on the successful introduction of multiple binding instruments directly related to the use of landmines, it could be argued that AWS should similarly rely on an extensive conventional framework. Indeed, the two systems share similar characteristics, notably their capacity to detonate/attack alone (although the mechanisms used by landmines are rudimentary and merely automated).106

However, this view is strongly undermined if the reasons that led States parties to ratify instruments banning anti-personnel landmines are taken into consideration. Following a field- and evidence-based method, actors promoting a strict restriction of landmines were able to demonstrate these weapons’ dramatic long-term impact on civilians’ and soldiers’ lives and health. Landmines were not only causing more harm and injuries within the civilian population than

102 See Protocol V on Explosive Remnants of War to the Convention on prohibitions or restrictions on the use of certain conventional weapons (hereinafter Protocol V CCW)

103 Corn (2016). Autonomous weapons systems: managing the inevitability of ‘taking the man out of the loop’, p. 213

104 Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-personnel Mines and on their Destruction (Ottawa Treaty), art. 2.1

105 See Amended Protocol II CCW


amongst soldiers, but were also resulting in “more casualties…after conflicts end than during hostilities”, therefore placing a serious burden on the State treating the victims, by overloading its medical personnel and facilities, and by generating extensive costs for societies to properly reintegrate victims.107 Even though these elements may seem to be the leading motives for a treaty on anti-personnel landmines, it must be recognized that such humanitarian considerations would likely have been insufficient for States to ratify a strict ban. Indeed, the introduction of a regulatory framework was mainly based on the fact that “the tactical benefit produced by anti-personnel landmines [was]…insufficient to justify the risk of injury to civilians”.108

Accordingly, the “attenuation between conflict regulation and strategic, operational and tactical realities” was relatively unproblematic in the case of landmines, but could impair attempts to ban AWS, considering that the “potential tactical and operational value inherent in these weapons” is significantly higher.109 If weapons regulations were to follow purely humanitarian considerations and progress “in a manner that may not adequately account for the legitimate interests of armed forces”, this could entail States’ pernicious lack of commitment to those rules.110

This risk is similarly reflected in the example of anti-personnel landmines. Some of the most militarily powerful States refused to ratify the Ottawa Convention, relying on military necessity, but have “generally complied with the more modest regulations of the Amended Protocol”.111 A comparable situation will likely arise with regard to AWS, considering the valuable military and strategic advantages of these future weapons. Following this example, the development of soft regulations as a first step should be encouraged. The process through which customary rules are created, through a combination of opinio juris and State practice, essentially sustains this idea. If sufficiently accepted by States, even soft law instruments could turn

107 Herby (2016). The humanitarian perspective: from field to policy to law, p. 43

108 Corn (2016). Autonomous weapons systems: managing the inevitability of ‘taking the man out of the loop’, p. 213. See Human Rights Watch (1997). Retired generals renew call for total antipersonnel mine ban

109 Ibid., p. 217
110 Ibid., p. 218

111 Lewis (2015). The Case for Regulating Fully Autonomous Weapons, pp. 1317-1318. See Capece (1999). The Ottawa Treaty and Its Impact on U.S. Military Policy and Planning, pp. 183-184 + 189. See also Bryden (2013). International law, politics, and inhumane weapons: the effectiveness of global landmine regime, p. 85


into customary law, enhancing the possibility for those regulations to eventually “ripen into ‘hard law’ treaties”.112

Finally, the efficiency of legal reviewing as a means of regulation is also undermined by the example of landmines. Even though the dramatic consequences mines had on victims’ health and bodies were notorious and addressed by legal reviews, they were used for an extensive time before any rule was set to limit their use.113 This raises the question whether a legal review under art. 36 would be effective if conducted on AWS, since these new weapons are composed of extremely technical and sometimes unpredictable mechanisms, and will thus require particularly strict and formal review protocols.114

3.4.4 Remotely-controlled systems

Remotely-controlled systems are essential to our debate, since they share characteristics with AWS and generate similar challenges, for instance regarding their presumed threat to the principles of distinction or proportionality. These unmanned systems and vehicles, often referred to as ‘drones’, whether aerial, overland or maritime, are distinct from fully autonomous systems: although they operate without an on-board crew, they are remotely controlled by humans, who retain overriding capacities over the weapons incorporated on the vehicle. It is often maintained that the most problematic aspects of AWS would be resolved by keeping humans in the loop to issue decisions over the system’s critical functions, following drones’ configuration. However, guaranteeing individuals’ control over a weapon is not an absolute safeguard against IHL violations. Humans can be considered ‘autonomous’ entities based on their “cognitive reasoning in the execution of their battlefield tasks”, which cannot be entirely controlled by their high command, since even the most perfected instruction cannot “replicate the demands of combat or all of the variables that will arise”.115 Problems posed by AWS in terms of distinction or proportionality thus remain of concern in situations where humans are exclusively in charge of targeting or attack functions, for instance when the civilian or military nature of a target can hardly be ascertained due to its equipment or clothing. Therefore, if strict compliance cannot be ensured from human beings, it becomes

112 Ibid., p. 1319. See Marchant, Gary & al. (2010). International Governance of Autonomous Military Robots, p. 306

113 See Herby (2016). The humanitarian perspective: from field to policy to law, pp. 45-46
114 Id.

115 Corn (2016). Autonomous weapons systems: managing the inevitability of ‘taking the man out of the loop’,


delicate to argue that retaining human control over AWS would alleviate the risk of IHL violations through their use. Moreover, certain scholars argue that one “should remain scrupulously neutral as between human or machine, and should affirmatively reject any a priori preference for human over machine”.116 This encourages systems that can effectively minimize the harmful consequences of a conflict, like AWS, “irrespective of whether the means to that end is human or machine or some combination of the two”.117

3.4.5 Cyber weapons

An international group of experts recently adopted a non-binding Manual on the International Law Applicable to Cyber Warfare, known as the Tallinn Manual, to address the lack of rules directly applicable to cyber conflicts. Even though this set of rules “does not set forth lex ferenda”, it demonstrates that in a very short amount of time, experts were able to agree by consensus on a broad range of debated notions.118 This is of particular significance for our debate, since this instrument constitutes a soft law framework regulating a weapon that can at first appear unpredictable and relies on constantly evolving technology. On one hand, any attempt to regulate the evolution of cyber weapons or AWS will necessitate a high degree of flexibility, which can be achieved through instruments like the Tallinn Manual. In the case of AWS, the rapidity of the negotiations or drafting process is essential, since major breakthroughs in the fields of robotics or AI could lead to prompt fielding of such weapons. Therefore, soft law instruments prima facie seem advantageous, since they are “based on logical and accurate transpositions of established international law” into the AWS realm and require a shorter adoption process than hard law treaties.119 On the other hand, compliance with non-binding codes and manuals can be difficult to achieve.

It is worth mentioning that hard law is “the route that domestic jurisdictions are increasingly pursuing to deal with cyber crime and to address data protection and associated issues”, even though such a process is fairly unrealistic on an international level considering the lack of consensus between States when it comes to cyber warfare.120 This could pave the way for a domestic regulation of AWS and prevent abuses in the short term while proceeding with negotiations to reach an international agreement. Such an approach is also a reminder that any

116 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, p. 410
117 Idem.

118 Schmitt (2013). Tallinn Manual on the International Law Applicable to Cyber Operations, p. 19

119 Boothby (2014). Conflict Law: The Influence of New Weapons Technology, Human Rights and Emerging Actors, p. 134


normative framework on AWS will need to include domestically enforceable norms, all such rules needing to “lie in a process of gradual international legal development that evolves as technology advances”.121

3.4.6 Nuclear weapons

The catastrophic consequences that even a limited use of nuclear weapons could entail, in terms of health, environmental or economic damage, are well-known and acknowledged by most States. Yet, although the International Court of Justice (ICJ) concluded that “the threat or use of nuclear weapons would generally be contrary to the rules of international law applicable in armed conflict”, no international agreement has yet come into force to ensure complete nuclear disarmament.122 A nuclear balance was established over the last decades between Nuclear Weapons States (NWS), and although the severity and global scale of the damage that would result from an attack is recognized, nuclear proliferation is considered by many as possessing unique and unmatched military and security advantages, since it is “precisely the devastating and terrifying nature of a nuclear war that is at the very foundation of nuclear deterrence”.123

In many ways, the challenges faced by attempts to regulate nuclear weapons can be transposed to AWS. First, two elements are considered to be the main obstacles to a global ban on nuclear weapons: their deterrent effect, as well as their unique assets for States’ defense strategies, “which cannot easily be replaced, when there were military substitutes for landmines, cluster bombs”, etc.124 A comparable outcome can be observed for AWS, since their ban could constitute a significant loss of military advantage for which alternatives would be hard to find, considering the groundbreaking nature of weapons ensuring so few human losses while maintaining a remarkable precision capacity. Second, when using those types of weapons, States will undeniably encounter a certain amount of unpredictability, which becomes problematic in light of their lethal capacity.125

Due to the ambiguity of its binding provisions, particularly in its art. VI, which merely obliges States to “pursue negotiations in good faith on effective measures relating to

121 Anderson & al. (2014). Adapting the Law of Armed Conflict to Autonomous Weapon Systems, pp. 406-407
122 ICJ, Legality of the Threat or Use of Nuclear Weapons, Advisory Opinion, p. 44

123 Grand (2016). Nuclear weapons: IHL considerations revisited, 20 years after the ICJ Advisory Opinion, p. 212

124 Ibid., p. 216

125 See International Committee of the Red Cross (2016). Autonomous Weapon Systems: Implications of Increasing Autonomy in the Critical Functions of Weapons

cessation of the nuclear arms race”, the 1968 Treaty on the Non-Proliferation of Nuclear Weapons had mixed effects.126 Another step towards the total ban of these destructive weapons was made with the adoption of the 2017 Treaty on the Prohibition of Nuclear Weapons, pursuant to a United Nations General Assembly resolution, to this date signed by 81 States and ratified by 37.127 Although not yet in force, numerous actors believe that it could considerably benefit nuclear disarmament. However, such a binding agreement will presumably encounter strong opposition from NWS and thus enforcement difficulties. Additionally, NWS refusing to sign this new treaty could benefit from other States nationally restricting their own use or stockpiling of nuclear weapons, by gaining military and political advantage over those States (due to the important deterrent effect of such weapons). A similar result could arise from a strict ban on AWS if total acceptance of the future binding instrument was not guaranteed.

The downsides of regulating nuclear or autonomous weapons through soft law should be borne in mind. While soft law is appropriate if sufficient cooperation between States can be ensured, for instance through the sharing of technical information or through more transparent legal reviews, it is very unlikely that such collaboration will occur between States. As historically illustrated by the nuclear arms race, States may consider such information or discoveries too valuable to share for the ‘mere’ purpose of ensuring compliance with soft law instruments.128

3.5. Concluding remarks

- Legal reviewing, hard law and soft law have been proposed to regulate the use of AWS.

- Legal reviewing is essential to ensure 'weapons law' compliance but does not suffice to prevent violations of IHL by AWS.

- Considering AWS' attractiveness for States if not strictly prohibited, hard law could be an adequate option for AWS regulation.

- The flexibility of soft law can address AI and deep learning technologies' future development, as well as compliance requirements.

- Existing regulatory frameworks underline the perks and disadvantages of these options:

126 Treaty on the Non-Proliferation of Nuclear Weapons (NPT). See Grand (2016). Nuclear weapons: IHL considerations revisited, 20 years after the ICJ Advisory Opinion
127 See International Campaign to Abolish Nuclear Weapons (n.d.). Signature and ratification status.
128 See Geneva Academy of International Humanitarian Law and Human Rights (2014). Autonomous Weapons
