
Submission Preview

Pentagon reassures public that its autonomous robotic tank adheres to "legal and ethical standards"

Accepted submission by Anonymous Coward at 2019-03-10 05:05:23 from the Killbots and Cream dept.
Hardware

The U.S. is seeking bids to improve its "basic" killbot to the point where it can "acquire, identify, and engage targets at least 3X faster than the current manual process."

"Why does any of this matter? The Department of Defense Directive 3000.09, requires that humans be able to “exercise appropriate levels of human judgment over the use of force,” meaning that the U.S. won’t toss a fully autonomous robot into a battlefield and allow it to decide independently whether to kill someone. This safeguard is sometimes called being “in the loop,” meaning that a human is making the final decision about whether to kill someone. " - https://gizmodo.com/u-s-army-assures-public-that-robot-tank-system-adheres-1833061674 [gizmodo.com] (and https://www.fbo.gov/index.php?s=opportunity&mode=form&id=29a4aed941e7e87b7af89c46b165a091&tab=core&_cview=0 [fbo.gov] both via https://boingboing.net/2019/03/09/atlas-shot.html) [boingboing.net]

Surely these will never be hacked!

Will an operator feel more trepidation about taking a life, e.g. due to not being in direct peril themselves? Or less, e.g. because of stronger desensitization? Does anyone have insightful links on psychological outcomes for drone operators?

Related information on the philosophical background of why having a human in the loop is required (they don't specify this, but e.g. without the human, land mine agreements might start to apply): https://www.act.nato.int/images/stories/media/capdev/capdev_02.pdf [nato.int]

HEY EDITORS! I suggest a new topic: "tech and society" for stuff like this.


Original Submission