
SoylentNews is people

posted by takyon on Tuesday March 01 2016, @12:45AM   Printer-friendly
from the half-artificial-half-intelligent dept.

John Markoff reports in the NYT on a new report by Paul Scharre, a former Pentagon official who helped establish United States policy on autonomous weapons. Scharre argues that autonomous weapons could be uncontrollable in real-world environments, where they are subject to design failure as well as hacking, spoofing and manipulation by adversaries. The report contrasts these completely automated systems, which have the ability to target and kill without human intervention, with weapons that keep humans "in the loop" in the process of selecting and engaging targets. "Anyone who has ever been frustrated with an automated telephone call support helpline, an alarm clock mistakenly set to 'p.m.' instead of 'a.m.,' or any of the countless frustrations that come with interacting with computers, has experienced the problem of 'brittleness' that plagues automated systems," Mr. Scharre writes.

The United States military does not have advanced autonomous weapons in its arsenal. However, this year the Defense Department requested almost $1 billion to manufacture Lockheed Martin's Long Range Anti-Ship Missile, which is described as a "semiautonomous" weapon. The missile is controversial because, although a human operator will initially select a target, it is designed to fly for several hundred miles while out of contact with the controller and then automatically identify and attack an enemy ship. As an alternative to completely autonomous weapons, the report advocates what it describes as "Centaur Warfighting." The term "centaur" has recently come to describe systems that tightly integrate humans and computers. Human-machine combat teaming takes a page from the field of "centaur chess," in which humans and machines play cooperatively on the same team. "Having a person in the loop is not enough," says Scharre. "They can't be just a cog in the loop. The human has to be actively engaged."


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by Francis on Tuesday March 01 2016, @01:25AM

    by Francis (5544) on Tuesday March 01 2016, @01:25AM (#311863)

    It's a ridiculous analogy to make.

    Weapons are designed for the purpose of killing people. If anything goes wrong they'd go on a massacre. In a nightmare scenario they'd go Skynet on us and wipe out every last one of us.

    If a car gets out of control, a few people might die. In an extreme case maybe a couple dozen, but that's it. More likely the car would just come to a stop without hurting anybody or damaging anything.

  • (Score: 2) by c0lo on Tuesday March 01 2016, @01:46AM

    by c0lo (156) on Tuesday March 01 2016, @01:46AM (#311872) Journal

> It's a ridiculous analogy to make.

    Who says the car analogies should be profound?

    Anyway, let's see what we can derive from this.

> Weapons are designed for the purpose of killing people. If anything goes wrong they'd go on a massacre.

> If a car gets out of control, a few people might die. In an extreme case maybe a couple dozen, but that's it.

    Ah, I see... so:
    1. humans cannot be trusted to drive a car, where the worst that can happen is a dozen lives lost, but...
    2. ... when facing the risk of a massacre, we must put our trust in humans.

    (my brain melting)

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0
    • (Score: 3, Insightful) by Francis on Tuesday March 01 2016, @02:37AM

      by Francis (5544) on Tuesday March 01 2016, @02:37AM (#311890)

A car is not a weapon in any sort of reasonable sense. It can be used to kill, but cars don't maneuver well enough and are really only useful if the person you're trying to kill doesn't know you're trying to run them down. The more likely problem isn't malice, it's inattentiveness and mistakes, which is exactly why letting an AI handle driving makes a lot of sense.

      As far as weapons go, having actual people engage in judgment and with a vested interest in the weapons not destroying all of humanity makes a lot of sense. They might engage in the occasional massacre in the worst cases, but those are small problems compared with the possible robot apocalypse if the AI controlling the weapons is allowed to make too many decisions on our behalf.

      • (Score: 2) by c0lo on Tuesday March 01 2016, @02:49AM

        by c0lo (156) on Tuesday March 01 2016, @02:49AM (#311897) Journal

> A car is not intended to be a weapon in any sort of reasonable sense.

FTFY... because the road to hell is paved with good intentions.

> As far as weapons go, having actual people engage in judgment and with a vested interest in the weapons not destroying all of humanity makes a lot of sense.

Ok, let's reduce the scope of the problem: humanity is no longer at stake, only a couple of thousand lives per incident - e.g. in the context of TFS, the wrong ship being hit from several hundred miles away.

        For your convenience, here's the list again:
        1. humans cannot be trusted to drive a car, where the worst that can happen is a dozen lives lost, but...
        2. ... when facing the risk of about 2000 lives being lost, we must put our trust in humans.

        Does it make more sense?

        --
        https://www.youtube.com/watch?v=aoFiw2jMy-0
        • (Score: 0) by Anonymous Coward on Tuesday March 01 2016, @08:03AM

          by Anonymous Coward on Tuesday March 01 2016, @08:03AM (#312007)

          > A car is not intended to be a weapon in any sort of reasonable sense.

          How about a car with a bomb in it? Or with a gamma source in it?

          The nice thing about a self-driving car bomb is that nobody has to die when the bomb goes off.

        • (Score: 2) by GreatAuntAnesthesia on Tuesday March 01 2016, @02:38PM

          by GreatAuntAnesthesia (3275) on Tuesday March 01 2016, @02:38PM (#312130) Journal

          I think the real issue is that an AI driving a car only has to detect whether that solid object there is likely to be in a collision with the car, and then attempt to avoid the collision. It doesn't much care whether said solid object is a human, a car, a cow or a fallen tree branch.

          An AI missile, however, has to not only make the distinction between a boat and a whale, or a boat and an iceberg, or a boat and a big mass of flotsam: It also has to distinguish between a boat full of hostiles, a boat full of refugees, a boat full of friendly combatants, a boat full of hostages with a handful of enemies aboard and so on, and respond accordingly. This is a much more difficult problem, AI-wise. It is made even more difficult by the fact that in a theatre of war you can be damned sure that your enemies will be doing all they can to confuse and misdirect your AI, something that civilian cars shouldn't have to deal with nearly so often.

      • (Score: 2) by captain normal on Tuesday March 01 2016, @05:17AM

        by captain normal (2205) on Tuesday March 01 2016, @05:17AM (#311945)

        "A car is not a weapon in any sort of reasonable sense."
        So why is someone who tries to run down a law enforcement officer charged with "assault with a deadly weapon"?

        --
        "It is easier to fool someone than it is to convince them that they have been fooled" Mark Twain
      • (Score: 2) by Unixnut on Tuesday March 01 2016, @12:34PM

        by Unixnut (5779) on Tuesday March 01 2016, @12:34PM (#312087)

        > A car is not a weapon in any sort of reasonable sense.

        What makes a weapon is intent. A car is a weapon, as is a hammer, or a rock, depending on the intent of the user.

Used as a weapon, a car is pretty decent (even though, as you mentioned, it wasn't designed specifically for the purpose). The kinetic energy it can reach alone can cause serious damage to a lot of things. High-speed impacts into trains, buildings or other vehicles can leave quite a bit of devastation.

That's without even considering that you can pack an autonomous van with explosives and program it to drive off and detonate hundreds of miles from its starting point. Yes, you can do that now with suicide bombers, but outside of one particular religion, there are not many people willing to do that. However, if you don't need a driver to take the bomb to its destination, I suspect a lot more people would find it an attractive method of voicing their grievances in a forceful manner.

        • (Score: 1) by Francis on Wednesday March 02 2016, @04:17AM

          by Francis (5544) on Wednesday March 02 2016, @04:17AM (#312502)

If we extend the definition of "weapon" to include cars, the term loses all meaning.

          As you correctly note, it's the use that makes something a weapon. But, comparing a weaponized car with the kinds of military devices that are being used and built specifically as weapons is a bit ridiculous.