WASHINGTON, D.C. - While most media outlets and average Americans are worried about drones, both their domestic uses and their presence in war zones abroad, human rights organizations have begun warning about an international shift toward more powerful weapons. Mary Wareham, the Advocacy Director of Human Rights Watch's (HRW) Arms Division, is a leading figure in the movement to stop the creation of these new weapons. HRW is a non-governmental organization that researches and advocates on international human rights issues.
Today, Wareham works on a problem HRW believes is the possible future of war: fully autonomous weapons, or, as they are often called, "killer robots." A killer robot, according to Wareham, is a "lethal autonomous weapon; the 'lethal' part is the 'killer' part and the 'autonomous system' is a robot."
Wareham has been with HRW's Arms Division since 1998. She moved from New Zealand to the U.S. to work on the International Campaign to Ban Landmines, which succeeded in 1997. She then began working with HRW to set up a monitoring system to ensure the international ban on landmines was being implemented appropriately. "We [New Zealand] have a long history of nuclear disarmament and protest and I grew up in that era. So it was natural to end up studying a topic like landmines and end up working in it," Wareham said.
Wareham coordinates the Campaign to Stop Killer Robots, which aims to raise awareness of these weapons and influence world leaders. The campaign, according to its website, "is an international coalition working to preemptively ban fully autonomous weapons." The coalition is made up of nine non-governmental organizations (NGOs): four national and five international.
Wareham believes that advanced weapons such as drones can show what the possible impact of these future weapons systems is. There is a key difference, however: drones are still controlled by a person, while fully autonomous weapons are not. A drone has someone behind it deciding who is a target and who is not; an autonomous weapon does not.
"Killer robots evoke the image of the Terminator robot from the Arnold Schwarzenegger movies. And when we talk to the artificial intelligence (AI) experts and roboticists, they say one day, sure, we'll be able to create a Terminator," Wareham said. For now, though, these weapons are "very basic systems with rudimentary artificial intelligence that could be quite lethal and could harm a lot of civilians. Rather than the super intelligent machines that look like they are humans."
In other words, unlike the Terminator, these robots do not have the capability to differentiate between a legitimate target and a civilian. And with no person behind the controls, target selection would be left almost entirely to the robots themselves.
Since robots still lack the capacity to make ethical decisions, Wareham said, they will likely be unable to operate in war within the constraints of international law. "We looked and found, after interviewing all these experts who knew the technology better than us and looking at what the law says, that for the time being, these systems will not be able to meet international humanitarian law, because they won't be able to adequately identify the target, and [their] use of force is also questionable," Wareham said.
"Is it acceptable for us to permit a machine to take a human life on the battlefield or policing?" That question, Wareham says, is at the center of the ethical debate over these weapons. These weapons cannot think empathetically or ethically when selecting a target, a capacity Wareham believes only a human soldier has.
The HRW report identified approximately six countries that have begun investing heavily in researching, and later developing, these autonomous systems. Although current weapons systems are not fully autonomous (they still retain some human control), Wareham thinks it is problematic that the research is heading in that direction.
The campaign has also looked at possible uses for killer robots outside the bounds of armed conflict, including policing and border control in domestic environments. So, much like drones, if these autonomous weapons were used in armed conflict and seemed to work well, they could end up on the streets among civilians.
Because they could be used for almost anything, these weapons systems will not all look the same; they could operate in the water, in the air and on the ground. According to Wareham, the unifying factor is "the notion of human control," or, more precisely, the lack of it. That one element is what has prompted human rights organizations and many states to speak up and push to preemptively block the development of these weapons systems.
Of the six nations that have begun investing, the U.S. is currently leading, with Russia, China, South Korea and Israel also conducting research. According to Wareham, the United Kingdom (U.K.) is also building "a large autonomous aircraft." Although the U.K. and many other states say they have no interest in developing these systems, they continue to research the possibility. The U.S., for its part, holds that no policies "prohibit or encourage" the development of these systems, Wareham says. In other words, the U.S. may not be using them yet, but it is open to developing them.
Wareham has been able to attend United Nations meetings where these weapons, and their ethics, are being debated. World leaders are trying to learn from experts to understand exactly what these weapons systems mean. However, "eventually we have to change that learning and information exchange into something that has concrete outcomes," Wareham says.