UN calls on governments to curb threats posed by 'killer robots'
Washington: Governments across the globe should support international talks to address the threat posed by fully autonomous robotic weapons, Human Rights Watch has said.
Human Rights Watch and the Harvard Law School International Human Rights Clinic on October 21, 2013, issued a question-and-answer document about the legal problems posed by these weapons.
Representatives from the Campaign to Stop Killer Robots, including Human Rights Watch, will present their concerns about fully autonomous weapons at a United Nations event in New York on October 21.
"Urgent international action is needed or killer robots may evolve from a science fiction nightmare to a deadly reality," Steve Goose, arms director at Human Rights Watch, said.
"The US and every other country should support holding international talks aimed at ensuring that humans will retain control over decisions to target and use force against other humans," he added.
Fully autonomous weapons, also called "lethal autonomous robotics" or "killer robots", have not yet been developed, but technology is moving towards increasing autonomy. Such weapons would select and engage targets without further intervention by a human.
In recent months, fully autonomous weapons have gone from an obscure issue to one that is commanding the attention of many governments, international institutions, and groups around the world.
Earlier in October, Austria, Egypt, France, Pakistan, and other countries called for international talks on fully autonomous weapons during the UN General Assembly First Committee on Disarmament and International Security in New York.
France, as chair of the next meeting of the Convention on Conventional Weapons, has been consulting to solicit support for adding fully autonomous weapons to the convention's work program.
On October 16, 272 engineers, computing and artificial intelligence experts, roboticists, and professionals from related disciplines issued a statement calling for a ban on fully autonomous weapons.
They cast doubt on the notion that robotic weapons could meet legal requirements for the use of force "given the absence of clear scientific evidence that robot weapons have, or are likely to have in the foreseeable future, the functionality required for accurate target identification, situational awareness, or decisions regarding the proportional use of force."
"We are seeing significant interest in tackling the issue of fully autonomous weapons, and now it's time to act," Goose said, adding: "The only viable solution will be a pre-emptive ban on the development, production, and use of these weapons."