Machine Learning and Society: Why Autonomous Warfare is a Bad Idea - PowerPoint PPT Presentation




SLIDE 1

Machine Learning and Society

SLIDE 2

Noel Sharkey

University of Sheffield
International Committee for Robot Arms Control
Foundation for Responsible Robotics

Why Autonomous Warfare is a Bad Idea

SLIDE 3

direct human control of weapons

SLIDE 4

[Diagram] autonomous weapons control: sensors (input) → computer (control) → motors (output)

SLIDE 5

PROGRAM: if heat is detected on one sensor, rotate the robot until both sensors detect heat, then fire weapons

(Animation of a robot with two heat sensors, showing a simple version of the kill decision; it is static in this PDF.)
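As a rough illustration of how little is needed to automate the decision shown above, here is a minimal Python sketch. It is a toy simulation only: the robot interface (read_heat_sensors, rotate, fire) and the SimulatedRobot class are hypothetical stand-ins invented for this example, not any real weapons API.

    # Toy sketch of the kill decision described on this slide: rotate until
    # both heat sensors detect heat, then fire. All names are hypothetical
    # stand-ins, not a real weapons interface.

    class SimulatedRobot:
        """Stand-in for a robot with two heat sensors, a motor, and a weapon."""

        def __init__(self, sensor_readings):
            # sensor_readings: sequence of (left_hot, right_hot) boolean pairs
            self._readings = iter(sensor_readings)

        def read_heat_sensors(self):
            return next(self._readings)

        def rotate(self):
            print("rotating toward heat source")

        def fire(self):
            print("FIRE (no human involved in this decision)")


    def kill_decision_loop(robot):
        while True:
            left_hot, right_hot = robot.read_heat_sensors()
            if left_hot and right_hot:
                robot.fire()       # both sensors detect heat: fire
                return
            if left_hot or right_hot:
                robot.rotate()     # heat on one side only: keep turning
            # neither sensor detects heat: keep scanning on the next reading


    # Example run: heat appears on one sensor, then on both.
    kill_decision_loop(SimulatedRobot([(False, False), (True, False), (True, True)]))

The point is not the code itself but how trivially such a decision can be automated once the sensing and firing hardware exists.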

SLIDE 6

US: autonomous X-47B

SLIDE 7


UK: Taranis autonomous intercontinental combat aircraft

SLIDE 8


  • Israel: autonomous Guardium
  • China: Anjian air-to-air combat
  • US: CRUSHER
  • US: autonomous submarine-hunting sub

SLIDE 9


SLIDE 10


4 major problem areas

  • I. over-reliance on computer programs
  • II. compliance with IHL
  • III. ethical compliance
  • IV. impact on global security
SLIDE 11

  • I. Possible failures (DoD 2012)

human error, human-machine interaction failures, malfunctions, communications degradation, software coding errors, enemy cyber attacks or infiltration into the industrial supply chain, jamming, spoofing, decoys, other enemy countermeasures or actions, and unanticipated situations on the battlefield
SLIDE 12

International Humanitarian Law (IHL)

necessitarians vs. humanitarians


SLIDE 13

  • II. Compliance with international humanitarian law?

★ Principle of distinction
★ Principle of proportionality
★ Precaution
★ Accountability
SLIDE 14
SLIDE 15

Autonomous Harpy radar-killer, made by IAI for the Turkish, Korean, Chinese and Indian armies


SLIDE 16
  • III. a moral case against

(Martens Clause)

the decision to kill should not be delegated to a machine

"being killed by a machine is the ultimate human indignity" (Maj. Gen. Latiff)


SLIDE 17


  • IV. 10 risks to global security

1. proliferation
2. lowered threshold for conflict
3. continuous global battlefield
4. accelerating the pace of battle
5. unpredictable interaction
6. accidental conflict
7. cyber vulnerability
8. militarisation of the civilian world
9. automated oppression
10. non-state actors
SLIDE 18

defensive systems - supervised autonomy (?)

SLIDE 19

A way forward


SLIDE 20

New York meeting, October 2012

SLIDE 21


CCW

Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects (e.g. blinding laser weapons, incendiary weapons, landmines and booby traps)


The convention has five protocols:

  • Protocol I restricts weapons with non-detectable fragments
  • Protocol II restricts landmines and booby traps
  • Protocol III restricts incendiary weapons
  • Protocol IV restricts blinding laser weapons (adopted on October 13, 1995)
  • Protocol V sets out obligations and best practice for the clearance of explosive remnants of war (adopted on November 28, 2003 in Geneva)

SLIDE 22

Conclusions 1

Autonomous Weapons Systems (AWS)

  • IHL compliance by AWS cannot be guaranteed for the foreseeable future.
  • It cannot be guaranteed that AWS will perform mission requirements predictably.
  • The unpredictability of AWS in unanticipated circumstances makes it extremely difficult, or even impossible, for weapons reviews to guarantee IHL compliance.
  • The threats to global security are unacceptably high.

SLIDE 23

Conclusions 2

Let us maintain meaningful human control over the application of violent force.

We are at a choice point in history where the decisions we make about automating warfare will determine the future of security. Mass proliferation could see the full automation and dehumanisation of warfare. What can the machine learning community do?

SLIDE 24

thank you for listening

icrac.net

responsiblerobotics.org

@StopTheRobotWar @noelsharkey