Welcome to the KDC Resource Future of Defence series, where we'll discuss new technologies and their potential impact on the defence industry and society.
Elon Musk and other high-profile names from the technology world have signed an open letter to the United Nations warning of the dangers of "killer robots". The group, comprising Musk and over 100 experts in robotics and artificial intelligence, has called on the UN to stop a dangerous high-tech arms race from breaking out.
The dangers of AI have been pushed to the forefront of tech conversations recently, both as the technology is developed for uses such as autonomous vehicles and through the public spat between Mark Zuckerberg and Musk. Zuckerberg plays down the dangers of AI, while Musk claims that AI is the "biggest existential threat" to mankind.
The letter was addressed to the UN Convention on Certain Conventional Weapons and demands that action be taken to “prevent an arms race” in lethal autonomous weapons.
The ethics of modern warfare currently break down along the following lines: technology such as guided missiles and drone strikes is permissible because it still requires a significant amount of human intervention, for instance a shoot/don't shoot decision made by pressing a button. With battlefield AI brains and their robotic bodies, by contrast, humans would develop and train the algorithms and constraints that dictate behaviour, but would be taken out of the loop for each kill/don't kill decision. Failure to properly address edge cases, poor security practices, rogue agents or even simple bugs could result in wildly unpredictable outcomes that spin out of human control.
As the letter notes: "Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways."
However, one could argue that this may lead to a "better" kind of warfare: sophisticated systems that learn over time to avoid civilian casualties, win wars quickly or cripple infrastructure with minimal loss of life might actually be an improvement on current practices.
The letter warns: "We do not have long to act. Once this Pandora's box is opened, it will be hard to close."
A UN group looking into the issue is due to meet next in November.