At the United Nations, a handful of countries have just blocked the establishment of a treaty banning Lethal Autonomous Weapons Systems (known by the French acronym SALA). Killer robots can therefore continue their development before arriving on the battlefield. The scientific community is worried, and NGOs are stepping up calls for such a treaty. What if the solution were to teach robots morality?
After a week of meetings at the United Nations Office in Geneva (Switzerland), the establishment of a new international treaty banning the use of lethal autonomous weapons was blocked by a minority of countries (Australia, South Korea, the United States, Israel and Russia). In the end, only some twenty non-binding recommendations were adopted at the close of the meeting, on the night of Friday to Saturday, chief among them the renewal of the current mandate of the group of governmental experts.
Germany and France merely proposed maintaining the principle of human control over the use of force, although President Emmanuel Macron had already declared himself "categorically opposed" to fully autonomous lethal weapons.
Yet only a treaty can protect humanity against these killer robots, and such a treaty is far from on track: for the time being, Australia, South Korea, the United States, Israel and Russia oppose any treaty proposal.
Despite its bulk, Boston Dynamics' very agile BigDog has already been the subject of numerous experiments alongside US Army soldiers. It was never made operational because of the noise of its engine, but the firm has since developed other quadrupedal robots that are more discreet and astonishingly capable. © Boston Dynamics
Do we need a code of ethics for killer robots?
SALAs, or killer robots, have been under development for years. Futura regularly covers the prototypes of Boston Dynamics, a firm that has been building robots for military use for more than twenty years: bipedal and quadrupedal machines with impressive capabilities. Whether with these robots, autonomous drones or unmanned armed vehicles, the arrival of a "Terminator" programmed to kill predefined targets independently and coldly is no longer the stuff of science fiction. This kind of combat robot could reach the battlefield within a few years, alongside troops or in their place. Above all, the absence of regulation in this area will inevitably lead to an arms race once the first combat robots are truly operational.
For the NGO Human Rights Watch, which launched the Stop Killer Robots campaign, the arrival of these weapons is dramatic. According to the NGO, dictators or terrorists could use them easily and cheaply to control or exterminate populations, or order the machines to carry out targeted assassinations. It is not the only organization raising the alarm about killer robots. On July 18, leading figures of the high-tech world, including Elon Musk, warned the member countries of the United Nations against these armaments. The signatories fear that their use in conflicts would exceed the scale of human understanding.
These reactions are far from the first. Every year since 2014, this kind of message has been sent to the United Nations, in vain. From the very beginnings of these technical developments, well-known scientific figures, starting with Stephen Hawking, Max Tegmark, Stuart Russell and Frank Wilczek, voiced their fears about the potential danger of AI.
If no agreement can prohibit them, one solution could be to humanize them, so that they behave in a "moral" way on the battlefield. This would mean endowing them with a code of values specific to combatants. This is what French army lieutenant-colonel Brice Erbland recommends in his book Robots tueurs (Killer Robots), published by Armand Colin. He describes what a SALA endowed with an AI capable of exercising good judgment, so as to behave like a soldier, might look like. In other words, these autonomous combat robots should integrate an artificial ethics. We would then no longer speak of SALA, but of SALMA (Lethal, Morally Autonomous Weapons Systems).
For this to happen, however, every state would have to play along. For now, a majority of them agree on the need to maintain human control over lethal autonomous weapons systems, which is why the vast majority of the 88 member states want a new treaty to be proposed in 2019.
What you must remember
- Several countries are hampering plans to ban autonomous killer robots.
- The scientific community, as well as the NGO Human Rights Watch, warn states against the use of these weapons which could spiral out of control.