

posted by LaminatorX on Tuesday April 14 2015, @11:38AM
from the actually-taken-over-by-Cybermen dept.

The UK is opposing international efforts to ban "lethal autonomous weapons systems" (Laws) at a week-long United Nations session in Geneva:

The meeting, chaired by a German diplomat, Michael Biontino, has also been asked to discuss questions such as: in what situations are distinctively human traits, such as fear, hate, sense of honour and dignity, compassion and love, desirable in combat? And in what situations do machines lacking emotions offer distinct advantages over human combatants?

The Campaign to Stop Killer Robots, an alliance of human rights groups and concerned scientists, is calling for an international prohibition on fully autonomous weapons.

Last week Human Rights Watch released a report urging the creation of a new protocol specifically aimed at outlawing Laws. Blinding laser weapons were pre-emptively outlawed in 1995, and since 2008 combatant nations have been required to remove unexploded cluster bombs.

[...] The Foreign Office told the Guardian: "At present, we do not see the need for a prohibition on the use of Laws, as international humanitarian law already provides sufficient regulation for this area. The United Kingdom is not developing lethal autonomous weapons systems, and the operation of weapons systems by the UK armed forces will always be under human oversight and control. As an indication of our commitment to this, we are focusing development efforts on remotely piloted systems rather than highly automated systems."

 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by VLM (445) on Tuesday April 14 2015, @01:36PM (#170382)

    http://www.stopkillerrobots.org/the-problem/

    It's interesting to see how failure never enters their thinking about the problem.

    If some E-4 on our gov side can go in over a network and load up a targeting pic of some terrorist, then, knowing how poorly security is traditionally implemented, "the bad guys" (at least from the PoV of our .gov) can go in over the same network that .gov E-4 used and load up a targeting pic of our own government members for the LOLz. In fact, not just "can" but "will". When you look at the ratio of sheer number of human brains on each side, it appears likely the net long-term effect of robots will be a lot of "own goal", "blue-on-blue fratricide" type stuff. If you've got the most people and you're operating solely defensively, then the net human brainpower might win and your robot warriors might save you... but why would anyone be attacking you if you're a nice guy (aka the opposite of the USA government)?

    Also, it's assumed the dang things will actually work. Insert all the tired old arguments about the Patriot missile batteries in the first Gulf War being either perfect or perfectly useless, depending on one's political leanings and axe to grind. I can guarantee they'll make the contractors back home a lot of dough, which is all that really matters in the end. But they might not actually "do" anything from a military-goal standpoint other than burn money and logistics capacity.

    The final failure is assuming that something robots can control, which used to be hard/expensive for humans to control, actually matters in modern warfare. Sure... control the skies all you want with AI autonomous robots, or empty kill-zone former farm fields, or whatever... as recent events in the Middle East show, the real problem is you're still not going to control the ground if the civilians all hate you and the roads are lined with IEDs and snipers. If the number of vest-wearing suicide bombers is higher than the number of media-acceptable soldier casualties, then "they" win, regardless of relative tech levels, as they just did in the Middle East, where the USA won every battle while losing the war (which sounds a lot like Vietnam, BTW). And nothing manufactures vest-wearing suicide bombers more effectively than robot missile drone strikes causing random slaughter of innocent civilians at weddings or whatever. I'm sure that empty field guarded by a robot sentry will be enemy-free, not that it matters WRT achieving the goals of the war or even a respectable retreat; meanwhile, every time the convoy moves out (with that fat logistics tail the robots require) the body bags pour back, accomplishing nothing, until the war is lost.

    Insert Alfred E. Neuman: "What, me worry?" If you hand-wave away all the inherent problems, robots could be quite the sci-fi book issue. I think they'll be staying in the sci-fi books, of course; see above.

    Historically, militaries have always geared up for the last war. So we had a multi-decade glut of useless battleships and aircraft carriers and tanks and airmobile helicopters, all of which will just be giant bullseye deathtraps in the next war. Robots will be like this. For financial/cultural reasons we'll have to have them everywhere as the highest priority until we get tired of losing with them. Then, after enough deaths, we can get rid of them and try something that actually works. Or, more likely, just lose another war.

  • (Score: 0) by Anonymous Coward on Tuesday April 14 2015, @04:47PM (#170452)

    but why would anyone be attacking you if you're a nice guy

    Because that other guy is not a nice guy?

    I could give you a good example, but then I would Godwin this thread.

    • (Score: 3, Insightful) by VLM (445) on Tuesday April 14 2015, @09:53PM (#170574)

      In all fairness "he who should not be named" had a fear of his larger neighbor to the east getting into an empire building mood, which turned out to be correct, so he figured his only hope was to get them before they got him. And his neighbors to the west were obnoxious jerks who destroyed his country's economy, and he used the turmoil to gain power, so he knew they weren't exactly his best friends, AND if they destabilized his country again, this time it would be his head rollin' when the revolutionaries started marching. Also, he knew he could trivially beat, smash even, just one front, but if two fronts opened then his country would lose the war AGAIN, so the only possible strategy was to smash the west and wheel around and smash the east.

      And the whole mess started back in 1914 because his neighbor to the SE, more or less, collapsed and his rival to the east thought it would be fun to take over the world by taking over the Ottoman Empire.

      Now, he was pretty much a jackass aside from that, but he pretty much did what he had to do; a saint might have lowered the death counts a bit, but only a bit. Nobody in a position of power leading one of the major powers in that entire hemisphere was a nice guy. There were plenty of nice guys in that hemisphere who got totally screwed, but the only thing they all had in common was that none of them had any serious political power. A whole hemisphere where the major powers were all led by bloodthirsty lunatics. Europe was a total clusterfuck for the entire first half of the century.

      • (Score: 0) by Anonymous Coward on Wednesday April 15 2015, @04:03PM (#171030)

        In all fairness "he who should not be named" had a fear of his larger neighbor to the east getting into an empire building mood, which turned out to be correct, so he figured his only hope was to get them before they got him.

        Which just shifts the example for the argument to that larger neighbour to the east.

  • (Score: 2) by HiThere (866) on Tuesday April 14 2015, @08:25PM (#170531)

    Sorry, but that would be a lousy science fiction story. In science fiction stories people worry about system failures... except, occasionally, someone who would be the villain if they weren't so stupid they didn't realize what they were doing. That latter takes a huge amount of skill to make believable. For some reason people find malice easier to believe than stupidity.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.