
posted by LaminatorX on Tuesday April 14 2015, @11:38AM   Printer-friendly
from the actually-taken-over-by-Cybermen dept.

The UK is opposing international efforts to ban "lethal autonomous weapons systems" (Laws) at a week-long United Nations session in Geneva:

The meeting, chaired by a German diplomat, Michael Biontino, has also been asked to discuss questions such as: in what situations are distinctively human traits, such as fear, hate, a sense of honour and dignity, compassion and love, desirable in combat? And in what situations do machines lacking emotions offer distinct advantages over human combatants?

The Campaign to Stop Killer Robots, an alliance of human rights groups and concerned scientists, is calling for an international prohibition on fully autonomous weapons.

Last week Human Rights Watch released a report urging the creation of a new protocol specifically aimed at outlawing Laws. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2008 combatant nations have been required to remove unexploded cluster bombs.

[...] The Foreign Office told the Guardian: "At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area. The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems."

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by HiThere on Tuesday April 14 2015, @08:33PM

    by HiThere (866) Subscriber Badge on Tuesday April 14 2015, @08:33PM (#170535) Journal

    Actually, it's not unreasonable at all. IIUC it's not that there's no moral or legal responsibility, it's that tracing the person who made the specifications is quite difficult. It's like the problem of finding where the program was before the last jump when that jump went through a pointer rather than following the normal sequence. Or the distinction between receiving a function call and being jumped to by a "go to" statement.
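
    To make the analogy concrete, here is a minimal C sketch (the function name is hypothetical, and the computed goto and __builtin_return_address are GCC/Clang extensions rather than standard C). Arriving somewhere by a call leaves the caller's return address on the stack, so a backtrace can recover where control came from; arriving by a jump through a pointer leaves no such record.

        #include <stdio.h>

        /* Reached by an ordinary call: the caller's return address sits on
           the stack, so a debugger backtrace (or this GCC/Clang builtin)
           can report where control came from. */
        static void reached_by_call(void)
        {
            printf("called from %p\n", __builtin_return_address(0));
        }

        int main(void)
        {
            reached_by_call();        /* traceable: the frame records the caller */

            /* Reached by a jump through a pointer (GNU "labels as values"):
               control simply lands at the label, with no frame and no return
               address recording where the jump came from. */
            void *where = &&landing;
            goto *where;

        landing:
            printf("jumped here; nothing records the jump site\n");
            return 0;
        }

    Built with gcc, the first call prints a real caller address; after the computed goto there is simply nothing equivalent to print, which is the point of the comparison.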

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 2) by frojack on Tuesday April 14 2015, @11:31PM

    by frojack (1554) on Tuesday April 14 2015, @11:31PM (#170629) Journal

    Except that a drone, or its pieces, is physical evidence.

    Besides, I program only with "Come From" statements, you insensitive clod.

    --
    No, you are mistaken. I've always had this sig.
    • (Score: 2) by HiThere on Wednesday April 15 2015, @06:45PM

      by HiThere (866) Subscriber Badge on Wednesday April 15 2015, @06:45PM (#171122) Journal

      A drone is, indeed, physical evidence. So are its pieces. But they can already be difficult to trace, and as they become commodity items they'll become even more difficult to trace. You may well be able to tell who manufactured one and, with a lot more work, who first bought it. Getting to the second-hand purchaser, or the person who stole it, is a bit more difficult. And some are already hand-crafted from other items (though admittedly the ones I've heard of were quite primitive, along the lines of a repurposed Roomba).

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.