The UK is opposing international efforts to ban "lethal autonomous weapons systems" (Laws) at a week-long United Nations session in Geneva:
The meeting, chaired by the German diplomat Michael Biontino, has also been asked to discuss questions such as: in what situations are distinctively human traits (such as fear, hate, a sense of honour and dignity, compassion and love) desirable in combat, and in what situations do machines lacking emotions offer distinct advantages over human combatants?
The Campaign to Stop Killer Robots, an alliance of human rights groups and concerned scientists, is calling for an international prohibition on fully autonomous weapons.
Last week Human Rights Watch released a report urging the creation of a new protocol specifically aimed at outlawing Laws. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2008 combatant nations have been required to remove unexploded cluster bombs.
[...] The Foreign Office told the Guardian: "At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area. The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems."
(Score: 3, Funny) by Thexalon on Tuesday April 14 2015, @03:29PM
We should make all our killbots have a pre-set kill limit before shutting down. That way, we can defeat them if necessary by sending wave after wave of our own men at them!
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
(Score: 0) by Anonymous Coward on Tuesday April 14 2015, @05:02PM
You cannot win against killer robots because they don't laugh at you. They go directly from ignoring you (before they have identified you as a target) to fighting you (after they have).