Over 1,000 high-profile artificial intelligence experts have signed an open letter calling for a ban on “offensive autonomous weapons”, warning that a military AI arms race could spell disaster for humanity.

Professor Stephen Hawking, Tesla’s Elon Musk, Google DeepMind chief Demis Hassabis, Noam Chomsky and Apple co-founder Steve Wozniak were among those who added their names to the document, which will be presented tomorrow in Buenos Aires at the International Joint Conference on Artificial Intelligence.

The letter states: “AI technology has reached a point where the deployment of [autonomous weapons] is – practically if not legally – feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms […] Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity.”

The letter, coordinated through the Future of Life Institute, adds that the development of lethal autonomous weapons could lead to a public backlash that would impede potentially beneficial AI research.

Elon Musk and Stephen Hawking have both previously warned of the dangers of advanced artificial intelligence. Musk has called AI “our biggest existential threat” and “potentially more dangerous than nukes”, while Hawking has said the technology could “spell the end of the human race”. Earlier this year, Bill Gates became the latest big name in the tech world to offer words of caution on the subject, with the former Microsoft CEO saying he was “in the camp that is concerned about super intelligence”.

In closing, the letter states, “We believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”

At a UN conference in Geneva in April discussing the future of weaponry, no country vigorously defended or argued for the development and deployment of lethal autonomous weapons systems, although the Czech Republic and Israel underlined that such systems may offer certain benefits, a line of argument the US also pursued.

Currently, there is no internationally agreed definition of what constitutes a lethal autonomous weapons system. Some states already deploy defense systems, such as Israel’s Iron Dome and the US Phalanx and C-RAM, that are programmed to respond automatically to threats from incoming munitions.