
posted by takyon on Tuesday March 01 2016, @12:45AM
from the half-artificial-half-intelligent dept.

John Markoff writes in the NYT about a new report by a former Pentagon official, Paul Scharre, who helped establish United States policy on autonomous weapons. The report argues that autonomous weapons could be uncontrollable in real-world environments, where they are subject to design failure as well as hacking, spoofing, and manipulation by adversaries. It contrasts these completely automated systems, which have the ability to target and kill without human intervention, with weapons that keep humans "in the loop" in the process of selecting and engaging targets. "Anyone who has ever been frustrated with an automated telephone call support helpline, an alarm clock mistakenly set to 'p.m.' instead of 'a.m.,' or any of the countless frustrations that come with interacting with computers, has experienced the problem of 'brittleness' that plagues automated systems," Mr. Scharre writes.

The United States military does not have advanced autonomous weapons in its arsenal. However, this year the Defense Department requested almost $1 billion to manufacture Lockheed Martin's Long Range Anti-Ship Missile, which is described as a "semiautonomous" weapon. The missile is controversial because, although a human operator will initially select a target, it is designed to fly for several hundred miles while out of contact with the controller and then automatically identify and attack an enemy ship. As an alternative to completely autonomous weapons, the report advocates what it describes as "Centaur Warfighting." The term "centaur" has recently come to describe systems that tightly integrate humans and computers. Human-machine combat teaming takes a page from the field of "centaur chess," in which humans and machines play cooperatively on the same team. "Having a person in the loop is not enough," says Scharre. "They can't be just a cog in the loop. The human has to be actively engaged."


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 2) by Unixnut (5779) on Tuesday March 01 2016, @12:34PM (#312087)

    > A car is not a weapon in any sort of reasonable sense.

    What makes a weapon is intent. A car is a weapon, as is a hammer, or a rock, depending on the intent of the user.

    Used as a weapon, a car is pretty decent (even though it's not designed specifically for the purpose, as you mentioned). The kinetic energy it can reach alone can cause serious damage to a lot of things. High-speed impacts into trains, buildings, or other vehicles can leave quite a bit of devastation.

    That's not even considering that you could pack an autonomous van with explosives and program it to detonate hundreds of miles from its start point. Yes, you can do that now with suicide bombers, but outside of one particular religion there are not many people willing to do it. However, if you don't need a driver to take the bomb to its destination, I suspect a lot more people would find it an attractive method of voicing their grievances in a forceful manner.

  • (Score: 1) by Francis (5544) on Wednesday March 02 2016, @04:17AM (#312502)

    If we extend the definition of weapon to include cars, the term loses all meaning.

    As you correctly note, it's the use that makes something a weapon. But comparing a weaponized car with the kinds of military devices being built and used specifically as weapons is a bit ridiculous.