A superpower arms race to build killer robots could wipe out humanity if left unchecked, experts fear.
The doomsday warning comes after a UN conference failed to agree a ban on Terminator-style “slaughterbots” – which are being developed by China, Russia and the US.
Major powers are investing billions to create advanced AI weapons that can hunt and strike targets with no input from controllers.
Last year a Turkish-made kamikaze drone made the world’s first autonomous kill on human targets in Libya, a UN report revealed.
But experts warn the technology is advancing so fast, governments and societies have not properly considered the dangers.
They say machines making their own decisions are prone to unpredictable and rapidly spreading errors.
These errors stem from algorithms – coded instructions so complex that even their programmers do not always understand them and cannot stop them going awry.
If AI weapons in the future are armed with biological, chemical or even nuclear warheads, the results could be unintentional Armageddon.
“It is a world where the sort of unavoidable algorithmic errors that plague even tech giants like Amazon and Google can now lead to the elimination of whole cities,” warns Prof James Dawes of Macalester College.
“The world should not repeat the catastrophic mistakes of the nuclear arms race. It should not sleepwalk into dystopia.”
MIT professor Max Tegmark, co-founder of the Future of Life Institute, issued a similarly dire warning this week.
A Turkish-made Kargu-2 drone made the world's first autonomous kill without a human controller last year, a UN report said.
He told Wired: “The technology is developing much faster than the military-political discussion.
“And we’re heading, by default, to the worst possible outcome.”
A potential ban on so-called Lethal Autonomous Weapons Systems (LAWS) was discussed last week at the UN’s five-yearly Convention on Certain Conventional Weapons.
Some of the 120 nations taking part – including Brazil, South Africa and New Zealand – argued LAWS should be restricted by treaty, like landmines and some incendiary weapons.
A growing list of countries including France and Germany support limits on some autonomous weapons, including those that target humans. China said it supports a narrow set of restrictions.
Other nations – including the US, Russia, India, the UK and Australia – resist a ban, arguing that continuing to develop killer robots is essential to avoid being left at a strategic disadvantage.
'Loitering munitions' like the Israeli-made Harop can be used to devastating effect in battle.