It might seem like the realm of science fiction, but "killer robots" – machines that autonomously decide when to use violence – are a very real possibility, warns Mary Wareham.
The use of armed drones over the past decade has dramatically changed warfare and brought new humanitarian and legal challenges. But unmanned aerial vehicles still have a human pilot – backed up by a team of people – who decides what to target and when to fire.
That human role in deciding when and what – or whom – to attack could be taken "out of the loop" and handed over to machines. Rapid advances in technology are driving efforts to develop weapons – so-called fully autonomous weapons or "killer robots" – that would be able to operate on their own without any human supervision. That would change warfare as we know it.
Nongovernmental organizations have formed a global coalition to campaign for a ban on fully autonomous weapons. In its first public statement, the Campaign to Stop Killer Robots said that "giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology."
In the six months since the campaign began, there has been a flurry of activity to address this threat and to fend off development of these weapons before it's too late. The campaign's membership has doubled, with 44 nongovernmental organizations in 21 countries signing on. There have been parliamentary questions and briefings on the topic in Belgium, France, Germany, the UK, and elsewhere. Campaigners have written letters to their political leaders, talked to their parliamentarians and policymakers, held seminars and events, briefed media, and undertaken an array of other activities aimed at highlighting the need to address fully autonomous weapons right now.
Most countries, including Australia, are still in the process of figuring out their policy position on fully autonomous weapons and have not spoken publicly.
But in recent months, a number of nations have expressed their views for the first time.
The United Nations Human Rights Council in Geneva met in May to consider a report by the UN expert on extrajudicial, summary or arbitrary executions that listed a range of concerns over fully autonomous weapons. The report called for a moratorium on development of the weapons until an international framework is agreed upon. More than two dozen nations spoke during the debate, expressing support for an international process to examine this topic.
At a meeting of the UN disarmament committee last week, France urged countries to “look to the future” by addressing the challenges posed by fully autonomous weapons and “the fundamental question of the place of man in the decision to use lethal force.” France said that military, legal and technical experts should consider the issue in “an appropriate disarmament forum.”
The statement was part of France’s pitch as chair of the next meeting of the Convention on Conventional Weapons to add fully autonomous weapons to the convention’s agenda of work. Another option is to create a high-level panel of experts to consider the issue, as the UN report recommended.
There seems to be no shortage of suggestions on where to take this issue, and so many concerns that multiple discussions may be necessary. But governments should carve out time to consider what lies ahead.
While fighting rages in Syria and humanitarian crises erupt around the globe, it can be hard for governments to get long-term issues onto their agenda. However, the technology of autonomous warfare is advancing fast. That's why our campaign is pushing for governments to take up and tackle the threat of fully autonomous weapons right now.