
SoylentNews is people

posted by LaminatorX on Tuesday April 14 2015, @11:38AM   Printer-friendly
from the actually-taken-over-by-Cybermen dept.

The UK is opposing international efforts to ban "lethal autonomous weapons systems" (Laws) at a week-long United Nations session in Geneva:

The meeting, chaired by a German diplomat, Michael Biontino, has also been asked to discuss questions such as: in what situations are distinctively human traits, such as fear, hate, a sense of honour and dignity, compassion and love, desirable in combat? And in what situations do machines lacking emotions offer distinct advantages over human combatants?

The Campaign to Stop Killer Robots, an alliance of human rights groups and concerned scientists, is calling for an international prohibition on fully autonomous weapons.

Last week Human Rights Watch released a report urging the creation of a new protocol specifically aimed at outlawing Laws. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2008 combatant nations have been required to remove unexploded cluster bombs.

[...] The Foreign Office told the Guardian: "At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area. The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems."

 
This discussion has been archived. No new comments can be posted.
  • (Score: 4, Interesting) by sudo rm -rf on Tuesday April 14 2015, @03:25PM

    by sudo rm -rf (2357) on Tuesday April 14 2015, @03:25PM (#170422) Journal

    Daniel Suarez wrote a book called 'Kill Decision' [wikipedia.org] about autonomous weapon systems. While a bit heavy on the action side for my taste, I found it quite entertaining.

    When the U.S. finds itself subjected to targeted drone assassinations, the race is on to find those responsible. But after the drones are discovered to be autonomous — programmed to strike without direct human control — the search for the perpetrators becomes infinitely more difficult. It's a discovery that heralds a new era of cheap, anonymous war, where the kill decision has moved from man to machine with lasting consequences for us all.


  • (Score: 2) by TheRaven on Tuesday April 14 2015, @04:22PM

    by TheRaven (270) on Tuesday April 14 2015, @04:22PM (#170440) Journal
    Sounds a bit odd. The human is still responsible for defining targets (or, at least, valid parameters for targets), so you still have a human in the loop. It's not as if you tell the drones 'go and kill the bad people!'; you still need to define who 'the bad people' are, whether that means anyone in the current theatre of operations wearing the wrong uniform (or no uniform), or specific targets identified by facial recognition. It's not really different from current missiles, which decide when (and, indeed, whether) to explode based on target information (location, proximity, and so on). You wouldn't say that someone who fired a GPS-guided missile at a school is not responsible just because the missile followed an evasive trajectory and then 'decided' to explode based on the location data.
    --
    sudo mod me up
    • (Score: 2) by HiThere on Tuesday April 14 2015, @08:33PM

      by HiThere (866) Subscriber Badge on Tuesday April 14 2015, @08:33PM (#170535) Journal

      Actually, it's not unreasonable at all. IIUC it's not that there's no moral or legal responsibility; it's that tracing the person who made the specifications is quite difficult. It's like the problem of finding where the program was before the last jump based on a pointer rather than based on sequence. Or the distinction between receiving a function call and being jumped to by a "go to" statement.
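      The call-versus-jump analogy can be sketched in C (a hypothetical illustration, not part of the original comment; the names here are invented): a function call carries a record of where control came from, so the "caller" is traceable, while a goto arrives with no such record in the program state.

      ```c
      #include <stdio.h>

      /* Illustrative only: with a call, the origin of control is recorded
         (here passed explicitly for portability; in practice it sits on the
         stack as a return address). A goto leaves no comparable trace. */

      static const char *last_caller = "unknown";

      static void strike(const char *caller) {
          /* The callee can identify who invoked it. */
          last_caller = caller;
      }

      int main(void) {
          strike("operator_console");   /* traceable: we know who called */
          printf("strike ordered by: %s\n", last_caller);

          last_caller = "unknown";
          goto fire;                    /* control arrives with no record */
      fire:
          printf("strike ordered by: %s\n", last_caller);
          return 0;
      }
      ```

      The first line of output names the caller; the second can only report "unknown", which is the commenter's point about tracing responsibility.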

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 2) by frojack on Tuesday April 14 2015, @11:31PM

        by frojack (1554) on Tuesday April 14 2015, @11:31PM (#170629) Journal

        Except that a drone, or its pieces, is physical evidence.

        Besides, I program only with "Come From" statements you insensitive clod.

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 2) by HiThere on Wednesday April 15 2015, @06:45PM

          by HiThere (866) Subscriber Badge on Wednesday April 15 2015, @06:45PM (#171122) Journal

          A drone is, indeed, physical evidence. So are its pieces. But they can be difficult to trace already, and as they become commodity items they'll become even more difficult to trace. You may well be able to tell who manufactured it, and, with a lot more work, who first bought it. Identifying the second-hand purchaser, or the person who stole it, is a bit more difficult. And some are already hand-crafted from other items (though admittedly the ones I've heard of were quite primitive, along the lines of a repurposed Roomba).

          --
          Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.