
posted by takyon on Monday March 11 2019, @04:36PM   Printer-friendly
from the killbots-and-cream dept.

The U.S. is seeking bids to improve its "basic" killbot to the point where it can "acquire, identify, and engage targets at least 3X faster than the current manual process."

U.S. Army Assures Public That Robot Tank System Adheres to AI Murder Policy

Why does any of this matter? Department of Defense Directive 3000.09 requires that humans be able to "exercise appropriate levels of human judgment over the use of force," meaning that the U.S. won't toss a fully autonomous robot onto a battlefield and allow it to decide independently whether to kill someone. This safeguard is sometimes called keeping a human "in the loop": a person makes the final decision about whether to kill.

Industry Day for the Advanced Targeting and Lethality Automated System (ATLAS) Program. Also at Boing Boing.

Surely these will never be hacked!

Will an operator feel more trepidation about taking a life when they are not in direct peril themselves? Or less, because of greater desensitization? Does anyone have insightful links about psychological outcomes for drone operators? (Ed: Don't worry about it.)

Related information to inform the philosophical background of why having a human in the loop is required (they don't specify this but e.g. without the human, land mine agreements might start to apply): https://www.act.nato.int/images/stories/media/capdev/capdev_02.pdf

HEY EDITORS! I suggest a new topic: "tech and society" for stuff like this. (Ed: It's Digital Liberty.)


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1, Interesting) by Anonymous Coward on Tuesday March 12 2019, @02:48PM (1 child)

    by Anonymous Coward on Tuesday March 12 2019, @02:48PM (#813276)

    it can "acquire, identify, and engage targets at least 3X faster than the current manual process."

    Let it acquire and track targets up to the trigger-pull point, then have the operator perform that final action, like using the "easy" auto-aim settings in a video game. The system would take the errors out of the aiming process and result in more accurate fire when the operator actually pulls the trigger. Less collateral damage is likely, and fewer friendly-fire incidents should result as well. Along these lines, such systems allow for much greater accuracy and control, as long as the trigger pull is performed only by the human operator.
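    The split the commenter describes can be sketched as a simple gate: automation ranks and aims, but the fire action is callable only through an explicit human-decision callback. This is a minimal illustrative sketch, not any real fire-control API; all names here (`Target`, `acquire_targets`, `engage`) are hypothetical.

    ```python
    # Hypothetical sketch: automation handles acquisition and ranking,
    # but the trigger pull is gated on a human decision callback.
    from dataclasses import dataclass

    @dataclass
    class Target:
        track_id: int
        confidence: float  # classifier confidence that this is a valid target

    def acquire_targets(sensor_tracks):
        """Automated step: rank candidate tracks by confidence."""
        return sorted(sensor_tracks, key=lambda t: t.confidence, reverse=True)

    def engage(target, human_confirms):
        """The machine may aim, but only the operator's 'yes' fires."""
        if not human_confirms(target):
            return "held"  # human in the loop declined the shot
        return f"fired at track {target.track_id}"

    tracks = [Target(1, 0.42), Target(2, 0.97)]
    best = acquire_targets(tracks)[0]  # automation picks the top candidate
    # The operator callback here declines anything below near-certainty.
    result = engage(best, human_confirms=lambda t: t.confidence > 0.99)
    print(result)  # "held" -- the operator did not authorize the shot
    ```

    The design point is that `engage` has no code path to fire without the callback returning true, which is one concrete reading of "in the loop."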

  • (Score: 2) by HiThere on Tuesday March 12 2019, @04:22PM

    by HiThere (866) Subscriber Badge on Tuesday March 12 2019, @04:22PM (#813338) Journal

    I think you're assuming that automated tracking is working flawlessly, and the target doesn't hide behind someone else.

    But you did say "less collateral damage" and "fewer friendly-fire situations," so perhaps you've got a point. I'd feel a lot happier about that point, though, if recent history didn't include so many remote fire-control systems firing on schools and funerals.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.