The UK is opposing international efforts to ban "lethal autonomous weapons systems" (Laws) at a week-long United Nations session in Geneva:
The meeting, chaired by a German diplomat, Michael Biontino, has also been asked to discuss questions such as: in what situations are distinctively human traits, such as fear, hate, a sense of honour and dignity, compassion and love, desirable in combat? And in what situations do machines lacking emotions offer distinct advantages over human combatants?

The Campaign to Stop Killer Robots, an alliance of human rights groups and concerned scientists, is calling for an international prohibition on fully autonomous weapons. Last week Human Rights Watch released a report urging the creation of a new protocol specifically aimed at outlawing Laws. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2008 combatant nations have been required to remove unexploded cluster bombs. [...]

The Foreign Office told the Guardian: "At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area. The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems."
Daniel Suarez wrote a book called 'Kill Decision' [wikipedia.org] about autonomous weapon systems. While a bit heavy on the action side for my taste, I found it quite entertaining.
When the U.S. finds itself subjected to targeted drone assassinations, the race is on to find those responsible. But after the drones are discovered to be autonomous — programmed to strike without direct human control — the search for the perpetrators becomes infinitely more difficult. It's a discovery that heralds a new era of cheap, anonymous war, where the kill decision has moved from man to machine, with lasting consequences for us all.
Actually, it's not unreasonable at all. IIUC it's not that there's no moral or legal responsibility, it's that tracing the person who made the specifications is quite difficult. It's like the problem of finding where the program was before the last jump when the jump went through a pointer rather than following the normal sequence. Or the distinction between receiving a function call and being jumped to by a "go to" statement.
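A minimal sketch of that distinction, in C (hypothetical code, not from the original post, and assuming GCC/Clang's computed-goto extension): a normal function call records a return address that a backtrace can follow, while a jump through a pointer leaves no record of where control came from.

#include <stdio.h>

void traced_handler(void) {
    /* Reached by a normal call: the return address sits on the stack,
     * so a debugger or backtrace can recover the call site. */
    printf("via call: origin recorded on the stack\n");
}

int main(void) {
    traced_handler();           /* origin recoverable after the fact */

    void *target = &&handler;   /* computed goto (GCC/Clang extension) */
    goto *target;               /* origin NOT recorded anywhere */

handler:
    printf("via indirect goto: no record of where control came from\n");
    return 0;
}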
Except that a drone, or its pieces, is physical evidence.
Besides, I program only with "Come From" statements, you insensitive clod.
A drone is, indeed, physical evidence. So are its pieces. But they can already be difficult to trace, and as they become commodity items they'll become even more difficult to trace. You may well be able to tell who manufactured it, and with a lot more work who first bought it. Getting to the second-hand purchaser, or the person who stole it, is a bit more difficult. And some are already hand-crafted from other items (though admittedly the ones I've heard of were quite primitive, along the lines of a repurposed Roomba).