2018: Decision Time for Canada on Killer Robots

Why the federal government must say no to a dystopian future of autonomous weapons.

by Paul Hannon | Jan 24, 2018


The third revolution in warfare is coming fast. Unlike most revolutions, we know this one is coming. What is even more unusual is that we can stop this revolution before it starts, before anyone is injured or killed. It will take a lot of political will by many countries, including Canada. Do we have the will and, more importantly, the courage to use it?

The first revolution in warfare was the invention of gunpowder. The second revolution was the creation of the nuclear bomb. The third revolution will be the development of fully autonomous weapons. While there is no agreed technical definition yet, these would be weapons designed to select, target and use lethal force without the meaningful control or final decision of a human. The weapons will include components using artificial intelligence, robotics, sensors, massive amounts of computer code and other advanced technologies.

In the United Nations Human Rights Council such weapons are known as lethal autonomous robotic systems, or LARs for short. In the Convention on Certain Conventional Weapons (CCW), another UN entity, they are known as Lethal Autonomous Weapon Systems. Still others call them fully autonomous or completely autonomous weapons. As a co-founder of a global campaign to ban them, I call them killer robots.

In October 2012 seven non-governmental organizations agreed to lead an international movement to oppose the development of what were then future weapons. In April 2013 the Campaign to Stop Killer Robots was launched in London. The next month the issue was first debated in the Human Rights Council in Geneva, and later that year it was raised again in New York at the UN's First Committee, which focuses on disarmament. In November the Convention on Conventional Weapons began work on the issue. Four years later the CCW moved into more formal talks via a Group of Governmental Experts (GGE).

Given the glacial speed at which international disarmament diplomacy usually moves, proceeding to the GGE level within four years is very impressive, because the process at that level is expected to result in an outcome, one of which could be a new treaty. So in UN terms things moved very quickly, but that is no match for the speed of technological change in the real world. The former UN High Representative for Disarmament Affairs, Angela Kane, compared it to Aesop's fable of the tortoise and the hare, where the race is "between the 'tortoise' of our slowly changing legal and institutional norms and the 'hare' of rapid technological change in the arms industry."

We should not be complacent simply because in Aesop's fable the tortoise won the race. While fully autonomous weapons do not yet exist, they are not a fairy tale, and if allowed to exist they will change warfare in ways we cannot imagine and over which we will have decreasing control. International Humanitarian Law (IHL) exists to remind parties to a conflict that they do not have an unlimited choice of methods or means of warfare. Or as the International Committee of the Red Cross puts it, "even wars have limits".

The most effective way to ensure those limits is to ban a specific weapon system: because states believe that the weapon cannot comply with existing IHL, or that the weapon requires new IHL, or because the weapon is inconsistent with the laws of humanity or the dictates of public conscience (the Martens Clause).

Banning weapons is not common, but it is also not rare. For example, biological weapons were banned in 1972, chemical weapons in 1993, blinding laser weapons in 1995, antipersonnel landmines in 1997, cluster munitions in 2008, and in 2017 we saw the adoption of the Treaty on the Prohibition of Nuclear Weapons. The technical elements of each of these weapons varied greatly, as did their military utility, expected or perceived. However, they all have at least two things in common. Firstly, enough states agreed that it was not acceptable to kill or injure civilians or combatants in the way these weapons were designed to do, and secondly, a human had meaningful control over the use of each of these banned weapons.

Over the last five years such a long list of ethical, moral, technical, legal, military and cultural concerns has been raised about killer robots that it is now time to determine whether this is an acceptable weapon system, regardless of the perceived or advertised benefits. Where autonomous weapons differ from those already banned is that no human will have meaningful control over the use of lethal force: no human-in-the-loop such as exists with weaponized drones, nor any human-on-the-loop as exists with some current automated defensive systems.

The Group of Governmental Experts at the CCW will engage in its second year of work to determine what steps, if any, should be taken on autonomous weapons. There are serious time constraints because of the pace of technological developments. 2018 is the year a decision needs to be made: negotiate a new legal instrument, or take no action, shrug and hope for the best. There will be many statements, numerous papers and expert presentations, but in the end participating states have two simple questions to answer: 1) Is it acceptable to delegate to a machine or algorithm the decision of whom to target and fire upon? 2) Is this the future of war we want?

The Canadian-led effort to ban landmines demonstrated clearly that it is better to ban a weapon before it is used than to try to ban it after it has created a global humanitarian crisis. Canada is a member state of the CCW and will be participating in the sessions in Geneva this year. As with all the other states, this is decision time for Canada. We know the revolution is coming, and we know we can stop it, peacefully, without death or injury. We also know that if we don't decide, or pretend we are not yet ready for such a decision, the revolution will start without us.

Does the Canadian government have the political will to say no to this dystopian future? If the government does not have the political will to ban autonomous weapons, then it had better have the political will to explain to its citizens, voters and taxpayers why we are allowing killer robots and the unknown changes to conflict that they will bring.

Author

Paul Hannon
Paul Hannon is the Executive Director of Mines Action Canada (MAC), the Canadian member of the International Campaign to Ban Landmines (ICBL) which was awarded the 1997 Nobel Peace Prize, and one of the co-founders in 2013 of the global Campaign to Stop Killer Robots.