Autonomous weapons that have the power to track and kill targets with Terminator-like efficiency aren’t just a Hollywood fantasy anymore.
Steve Wozniak, Elon Musk, Stephen Hawking and hundreds of AI and robotics researchers say the technology to build autonomous weapons, systems that select and engage targets without human intervention, is feasible within years, not decades. And, they argue, we need to ban it now.
In an open letter published by the Future of Life Institute, the big thinkers warned that pursuing AI weapons, like armed quadcopters that can search for and eliminate people meeting certain predefined criteria, would trigger a third revolution in warfare (after gunpowder and nuclear arms) and kick off an all-new arms race, which they argue would be a very bad idea.
Unlike nuclear weapons, AI weapons wouldn’t require expensive or hard-to-obtain raw materials, so military powers could mass-produce them cheaply and easily. Before long, the group warns, terrorists and dictators would be able to obtain them on the black market and wreak all kinds of havoc:
“Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.”
Some experts have argued that AI weapons could be a good thing: replacing human soldiers with machines would reduce casualties for the side that fields them. But the letter counters that it would also lower the threshold for going to battle in the first place. Ultimately, the signatories argue, AI has great potential to benefit humanity, and that, not a military AI arms race, should be the goal of the field.
Source: Future of Life