Read the Letter Elon Musk Signed Asking World Leaders to Curb Killer Robots

Musk and more than 100 leaders of AI and robotics companies warned that lethal autonomous weapons could usher in the 'third revolution in warfare.'
Elon Musk often warns against the perils of artificial intelligence and its ability to escape human control. He has suggested that AI could be harnessed to conduct a widespread internet hack, equated the danger of highly intelligent robots with that of nuclear weapons, and called AI potentially the "biggest risk we face as a civilization."
If you're not concerned about AI safety, you should be. Vastly more risk than North Korea. pic.twitter.com/2z0tiid0lc
— Elon Musk (@elonmusk) August 12, 2017
But Musk does more than talk and tweet about these dangers. He has donated $10 million to the Future of Life Institute, where he is a scientific advisor along with Alan Alda, Nick Bostrom, Morgan Freeman and Stephen Hawking. He also sponsors OpenAI, a nonprofit research group dedicated to making AI safe. Recently, a bot created by OpenAI beat human professional video game players.
His latest attempt to raise awareness about the threat of AI comes in the form of a signature on a letter to a United Nations convention. Musk joined more than 100 fellow robotics and AI company leaders in urging the UN's new Group of Governmental Experts on Lethal Autonomous Weapon Systems to work swiftly. The group was formed to protect the world from the potentially devastating effects of AI-powered weapons.
Musk and his fellow signatories wrote that they feel "especially responsible" for bringing this issue to light, because the companies and initiatives they lead are the ones building the technologies "that may be repurposed to develop autonomous weapons." They also offered to provide counsel to the new UN group.
Their letter was written in response to the cancellation of the UN group's first meeting, which was postponed "due to a small number of states failing to pay their financial contributions to the UN," according to the letter. The meeting had been scheduled for Monday, Aug. 21, but will now take place in November.
Read the full letter below for more insight into why Musk and leaders of AI and robotics companies believe autonomous weapons must be kept in check.
An Open Letter to the United Nations Convention on Certain Conventional Weapons
As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm.
We warmly welcome the decision of the UN's Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations.
We commend the appointment of Ambassador Amandeep Singh Gill of India as chair of the GGE. We entreat the High Contracting Parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.
We regret that the GGE's first meeting, which was due to start today, has been cancelled due to a small number of states failing to pay their financial contributions to the UN. We urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November.
Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close.