Read the Letter Elon Musk Signed Asking World Leaders to Curb Killer Robots

Musk and more than 100 leaders of AI and robotics companies warned that lethal autonomous weapons could usher in the 'third revolution in warfare.'

By Lydia Belanger



Elon Musk often warns about the perils of artificial intelligence and its ability to escape human control. He has raised the possibility of AI being harnessed to conduct a widespread internet hack, equated the potential danger of highly intelligent robots with that of nuclear weapons and called AI potentially the "biggest risk we face as a civilization."

But Musk does more than talk and tweet about these dangers. He's donated $10 million to the Future of Life Institute, where he's a scientific advisor along with Alan Alda, Nick Bostrom, Morgan Freeman and Stephen Hawking. He also sponsors OpenAI, a nonprofit group dedicated to researching how to make AI safe. Recently, a bot created by OpenAI beat human professional video game players.

Related: Elon Musk Says Mark Zuckerberg's Understanding of AI Is 'Limited' After the Facebook CEO Called His Warnings 'Irresponsible'

His latest attempt to raise awareness about the threat of AI comes in the form of a signature on a letter to a United Nations convention. Musk joined more than 100 fellow robotics and AI company leaders in urging the UN's new Group of Governmental Experts on Lethal Autonomous Weapon Systems to work swiftly. The group was formed to protect the world from the potentially devastating effects of AI-powered weapons.

Musk and his fellow signatories wrote that they feel "especially responsible" for bringing this issue to light, because the companies and initiatives they lead are the ones building the technologies "that may be repurposed to develop autonomous weapons." They also offered to provide counsel to the new UN group.

Their letter was written in response to the cancellation of the UN group's first meeting, which was postponed "due to a small number of states failing to pay their financial contributions to the UN," according to the letter. The meeting had been scheduled for Monday, Aug. 21, but will now take place in November.

Related: 20 Weird Things We've Learned About Elon Musk

Read the full letter below for more insight into why Musk and leaders of AI and robotics companies believe autonomous weapons must be kept in check.

An Open Letter to the United Nations Convention on Certain Conventional Weapons

As companies building the technologies in Artificial Intelligence and Robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm.

We warmly welcome the decision of the UN's Conference of the Convention on Certain Conventional Weapons (CCW) to establish a Group of Governmental Experts (GGE) on Lethal Autonomous Weapon Systems. Many of our researchers and engineers are eager to offer technical advice to your deliberations.

We commend the appointment of Ambassador Amandeep Singh Gill of India as chair of the GGE. We entreat the High Contracting Parties participating in the GGE to work hard at finding means to prevent an arms race in these weapons, to protect civilians from their misuse, and to avoid the destabilizing effects of these technologies.

We regret that the GGE's first meeting, which was due to start today, has been cancelled due to a small number of states failing to pay their financial contributions to the UN. We urge the High Contracting Parties therefore to double their efforts at the first meeting of the GGE now planned for November.

Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora's box is opened, it will be hard to close.

Lydia Belanger is a former associate editor at Entrepreneur. Follow her on Twitter: @LydiaBelanger.
