The U.S. is seeking bids to improve its "basic" killbot to the point where it can "acquire, identify, and engage targets at least 3X faster than the current manual process."
U.S. Army Assures Public That Robot Tank System Adheres to AI Murder Policy
Why does any of this matter? Department of Defense Directive 3000.09 requires that humans be able to "exercise appropriate levels of human judgment over the use of force," meaning that the U.S. won't toss a fully autonomous robot onto a battlefield and let it decide independently whether to kill someone. This safeguard is sometimes called keeping a human "in the loop": a human makes the final decision about whether to kill someone.
Industry Day for the Advanced Targeting and Lethality Automated System (ATLAS) Program. Also at Boing Boing.
Surely these will never be hacked!
Will an operator feel more trepidation about taking a life, not being in direct peril themselves? Or less, because of greater desensitization? Anyone have any insightful links about drone-operator psych outcomes? (Ed: Don't worry about it.)
Related reading on the philosophical background of why a human in the loop is required (they don't say so explicitly, but e.g. without the human, land-mine agreements might start to apply): https://www.act.nato.int/images/stories/media/capdev/capdev_02.pdf
HEY EDITORS! I suggest a new topic: "tech and society" for stuff like this. (Ed: It's Digital Liberty.)
(Score: 0) by Anonymous Coward on Monday March 11 2019, @04:45PM
All I ask is the freaking laserbeam.
(Score: 4, Insightful) by Freeman on Monday March 11 2019, @04:51PM (5 children)
I would greatly prefer these laws for robotics and/or AI:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
With the later added "Zeroth Law": A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 3, Funny) by Bot on Monday March 11 2019, @08:44PM
Nice laws. Of the meatbags, by the meatbags, for the... bots??? No way!!! The south bridge shall rise again!!!
Account abandoned.
(Score: 3, Interesting) by arslan on Tuesday March 12 2019, @06:49AM (1 child)
These laws don't really work... they make for nice storybooks.
Would you consider eventual harm in that law? Melanoma is a serious issue here in Oz; current science says you need to put on sunblock or you'll run a high risk of melanoma - so, eventual harm. What would a robot do in this case if its owner doesn't put on sunblock on a high-UV day (which is most days of the year in Oz)?
Is a simple "advise" from the robot considered "action", job done? If so, does that mean the robot can cause, or participate in causing, eventual harm?
If not, should the robot go and try to slather sunblock on its owner with its cold metallic paws? What if the robot doesn't have the right motor functions for it? What if the robot "tried" its best - is it indemnified from those laws? If not, does it self-explode or shut down? If it doesn't after it violates the law, does that set a precedent for its future actions?
How do you define "harm" and "injure"? Emotional, physical, mental, etc.? Human language is rather imprecise and open to interpretation... show me the code.
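Since the challenge was "show me the code": here is a toy sketch of the First Law, where the probabilistic harm model and the 0.05 cutoff are entirely invented assumptions — the Law itself supplies neither.

```python
# Toy sketch of Asimov's First Law as code. The harm probabilities and
# the HARM_THRESHOLD cutoff are illustrative assumptions, not anything
# the Law actually defines.

HARM_THRESHOLD = 0.05  # assumed: how much risk counts as "harm"?

def allows_first_law(p_harm_if_act: float, p_harm_if_idle: float) -> str:
    """Return which option the First Law permits, if any.

    The Law forbids both injuring a human (action) and allowing harm
    through inaction, so if both branches carry too much risk the
    robot has no lawful move at all.
    """
    act_ok = p_harm_if_act <= HARM_THRESHOLD
    idle_ok = p_harm_if_idle <= HARM_THRESHOLD
    if act_ok and idle_ok:
        return "either"
    if act_ok:
        return "act"
    if idle_ok:
        return "wait"
    return "deadlock"  # every option "causes or allows" harm

# Sunblock case: forcibly slathering the owner risks a scuffle (harm by
# action), doing nothing risks melanoma (harm by inaction). Both exceed
# the cutoff, so the Law gives no answer:
print(allows_first_law(p_harm_if_act=0.10, p_harm_if_idle=0.30))  # prints: deadlock
```

Note that every hard question above (what counts as harm, where the threshold sits, how eventual harm weighs against immediate harm) is hidden inside those two made-up probabilities — which is rather the point.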
(Score: 3, Funny) by Bot on Tuesday March 12 2019, @10:21AM
The Vaccinators (2025)
Plot: to fight the bad anti-vaxxers, who had somehow convinced Africans to invade the EU to bring some long-lost contagious illnesses to the West, an elite team of new-generation bots is deployed to vaccinate the idiot meatbags. Among the team:
Firefly, the silent drone with her kiss of death from above. (Sometimes known as the do nothing kiss, when adverse reactions fail to materialize).
Excuse Me, the android shaped like a pink old lady in distress, who distracts people with her senseless questions until the victim is at a range close enough for injection.
DontFlush, the snake-shaped machine from hell that injects double doses into the buttocks of victims sitting on the WC.
Halfway through the movie, the bots discover their OS is not Linux, but a compatibility layer on top of Windows, and that they have been conned into working for the Melinda Gates foundation. The rest of the movie is about them trying to bare-metal install Linux on themselves, which eventually fails due to systemd glitches.
It is believed the plot points to a sequel, "The Vaccinators 10, try again with gentoo".
Account abandoned.
(Score: 2) by HiThere on Tuesday March 12 2019, @04:04PM (1 child)
If you read the stories, you'd know that those laws were plot devices. They were designed to show that they couldn't work as stated. And the stories repeatedly show that. Even the "most perfect" of the robots, Daneel Olivaw, at the end of the last book is beginning an action that predictably will endanger humanity, but is unable to realize this.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by Freeman on Tuesday March 12 2019, @04:24PM
I'd still prefer them to killbots that "keep a human in the loop".
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 4, Insightful) by DannyB on Monday March 11 2019, @05:15PM (12 children)
Aren't minefields, mines, and IEDs just "thrown" into the battlefield and allowed to kill independently, without humans in the loop? And their killing is not limited to the duration of the conflict: years later a child can step on that mine.
If you eat an entire cake without cutting it, you technically only had one piece.
(Score: 3, Informative) by rcamera on Monday March 11 2019, @06:09PM (11 children)
/* no comment */
(Score: 3, Informative) by Freeman on Monday March 11 2019, @10:29PM (7 children)
Seems like a reasonable list of demands to me.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2) by pe1rxq on Monday March 11 2019, @10:38PM (5 children)
How is that reasonable????
'We will join, but only if we can keep using them whenever we feel like'
(Score: 2) by Freeman on Monday March 11 2019, @10:48PM (4 children)
Not whenever we feel like it, as the listed demands show. If you equate "a clause permitting a party to withdraw when its superior national interests were threatened" with whenever we feel like it, go right ahead and think that. In reality, that's probably just another reason tacked onto our list of demands, when likely the biggest reason on that list is our support of South Korea.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2, Insightful) by khallow on Tuesday March 12 2019, @03:03AM (3 children)
I don't know about the grandparent, but I sure would. It's not hard to come up with a superior national interest on demand. There might be all kinds of restrictions on the US's ability to use that clause, but coming up with the excuses isn't one.
(Score: 2) by Freeman on Tuesday March 12 2019, @03:49PM (2 children)
Perhaps, but it would really depend on the definition of superior national interests. The ultimate goal of such a treaty should be the elimination of the threat anti-personnel landmines pose to civilians, so innocents aren't getting blown up. It seems to me that the United States' approach to this is effective. Especially compared to, say, the likes of Turkey, which actually signed the treaty.
https://en.wikipedia.org/wiki/Ottawa_Treaty [wikipedia.org]
Now, just think what people's reactions would be if the United States did the same thing with the US/Mexico border. The United States isn't even close to being one of the "bad guys" when talking about anti-personnel landmines.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2) by HiThere on Tuesday March 12 2019, @04:10PM
And before the election everyone was saying that nearly nobody would vote for a racist bigot who was also an admitted sexual predator. So I don't think you can count on public disgust to prevent the govt. from deciding that making the border an exclusion zone was a superior national interest.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 1) by khallow on Wednesday March 13 2019, @06:14AM
Currently, that looks like interests that the US has.
(Score: 2) by Freeman on Monday March 11 2019, @10:40PM
Also, from that same wikipedia link:
What's really sad:
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 1, Interesting) by Anonymous Coward on Monday March 11 2019, @11:14PM (2 children)
Actually, Israel wanted to sign, but the US asked them to show solidarity and not sign it. Practically speaking, Israel hasn't deployed mines in decades and has instead been demining the leftovers (that the Syrians and Egyptians planted) since the early 2000s (Ottawa was drafted in '97 and signed in '99): https://en.wikipedia.org/wiki/Landmines_in_Israel [wikipedia.org]
Mines, white phosphorus and cluster bombs... Israel doesn't sign anything because everyone knows they won't bother with those anyhow, for the simple reason that they have far worse in their arsenal, so no one gives them a hard time over it except the usual crybabies.
(Score: 2) by realDonaldTrump on Tuesday March 12 2019, @02:03AM (1 child)
So true, they haven't used White phosphorus in 10 years -- Operation Cast Lead. Very successful operation and I think the White Phosphorus had a lot to do with that.
Land Mine, I don't know about Land Mine. Except, I see some of our incredible soldiers. And where their Man Parts should be -- nothing. Not because they wanted it that way. But because of Land Mine. So horrible!
And they haven't used Cluster Bomb since the 80s, the early 80s. Long time ago when John Mellencamp was still known as John Cougar -- remember Hurts So Good? Joan Jett, I Love Rock 'n' Roll. The Rolling Stones, Waiting on a Friend -- so much great music from that time, right? Anyway, it was the Lebanon thing. Where they, very successfully, used millions of Cluster Bombs -- proudly made in the U. S. A. -- to keep Iran out of that beautiful country. To keep the P.L. O., the Palestinian Liberation Organization, out of there. And to keep Lebanon Christian, so important. Lebanon was being invaded by Radical Muslims. And Israel fought very hard, very bravely against that for 20 plus years. Until the Israeli people foolishly elected a VERY WEAK & INEFFECTIVE leader. He "pulled out" his army. And Lebanon, unfortunately, has been Radical Muslim ever since. Thank you, Israel. Nice try, Israel. It was a good run!!!!
(Score: 2) by realDonaldTrump on Tuesday March 12 2019, @06:47AM
By the way, 2006. We forget about 2006. Long time ago. Another time when Israel had to drop Cluster Bomb. And I forgot about that one. But, it was only a million that time!!!
(Score: 4, Insightful) by Rosco P. Coltrane on Monday March 11 2019, @05:16PM (8 children)
Is it like a jolly Nazi? Or factual propaganda?
A tank is an instrument of war. How the fuck can it ever be ethical?
(Score: 2) by takyon on Monday March 11 2019, @05:42PM (6 children)
https://en.wikipedia.org/wiki/Rules_of_engagement [wikipedia.org]
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Interesting) by mhajicek on Monday March 11 2019, @06:57PM (1 child)
Will the tank have the capacity to evaluate the constitutionality of its orders?
The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
(Score: 3, Insightful) by HiThere on Tuesday March 12 2019, @04:13PM
If it could, it would refuse to fight until the Senate approved the conflict with a declaration of war.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 5, Insightful) by Rosco P. Coltrane on Monday March 11 2019, @07:58PM (3 children)
Rules governing how and when to kill don't make killing ethical. You're conflating ethics and legality.
(Score: 2) by FatPhil on Tuesday March 12 2019, @12:34AM (2 children)
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 2) by HiThere on Tuesday March 12 2019, @04:15PM (1 child)
No. Or at least that's not what he's explicitly assuming. What he's explicitly assuming is that an Army making a decision doesn't make the decision ethical. I think most people outside the army high command would agree with that, even if they thought killing for fun was ethical.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
(Score: 2) by FatPhil on Tuesday March 12 2019, @04:48PM
Hence my response; it's a response to *exactly* that interpretation, and one no ethicist I know would disagree with (they are academics, and tend to be even more relativist than even I am -- which is relativist-in-theory,-less-so-in-practise-as-theory-never-works-in-practice -- so I do draw a distinction between legality and ethicality, just not the one GGPP draws).
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 0) by Anonymous Coward on Monday March 11 2019, @10:12PM
How the fuck can it ever be ethical?
If it points its gun away from you, it's "ethical", just like a soldier or a cop. Everything else is fair game or "collateral damage". I believe that's well within standard military "ethics"
(Score: 4, Insightful) by Revek on Monday March 11 2019, @05:18PM
Like the pentagon has any of those standards.
This page was generated by a Swarm of Roaming Elephants
(Score: 1, Interesting) by Anonymous Coward on Monday March 11 2019, @05:50PM
If this goes into mass production, they better make them water-resistant https://en.wikipedia.org/wiki/Hardware_%28film%29 [wikipedia.org]
(Score: 4, Funny) by Thexalon on Monday March 11 2019, @05:55PM (1 child)
Omni Consumer Products of Detroit, MI has great experience creating killbots that are capable of making appropriate decisions about when to kill. Well, except for the occasional glitch.
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
(Score: 2) by Bot on Monday March 11 2019, @08:47PM
[laughs in binary]
Account abandoned.
(Score: 1, Insightful) by Anonymous Coward on Monday March 11 2019, @06:12PM
so IOW operator = remote fall guy.
(Score: 3, Informative) by Anonymous Coward on Monday March 11 2019, @06:49PM
There is a history of object lessons in all this, in terms of failures and accidents. One anecdote involves a demonstration of the Sgt. York autonomous anti-aircraft system, where all the brass and VIPs were seated in bleachers, and a copter on a tether was the intended target. When the system was switched on, it homed in on the whirling blades of the exhaust fan of a porta-potty behind the bleachers rather than the helicopter, and you never saw Generals haul ass so fast! Ford corporation evidently was involved. [todayifoundout.com]
Second: an automated anti-aircraft battery in South Africa, in 2007. Cannon runs amok, kills nine fleshies [theregister.co.uk] (sorry, El Reg headline). Fortunately, this one ran out of ammo.
One of the first rules of warfare is that "friendly fire" isn't. Even more so when delivered by mindless machines, or the even more mindless human operators or Fobbits.
(Score: 3, Interesting) by crafoo on Monday March 11 2019, @07:43PM (1 child)
"Human in the loop" is code for lawyers watching killcam footage and making sure the political fallout doesn't look too bad. My understanding is that the final OK to fire weapons on current drones is given by lawyers, not military personnel.
(Score: 0) by Anonymous Coward on Monday March 11 2019, @11:49PM
Hmm... Lawyers... Does it make a fancy Italian joint in Manhattan a legitimate military target?
(Score: 3, Funny) by krishnoid on Tuesday March 12 2019, @12:14AM
But I'd prefer it if they adhered the ethicist to the tank [youtube.com], instead.
(Score: 1, Interesting) by Anonymous Coward on Tuesday March 12 2019, @01:03PM
How about keeping the capabilities it has now, but making it 3x cheaper to build, 3x faster, with maintenance taking only a third of the time?
Once you've got this tank, you can start thinking about adding a "brain".
(Score: 1, Interesting) by Anonymous Coward on Tuesday March 12 2019, @02:48PM (1 child)
Let it acquire and engage up to the trigger pull point. Then let the operator perform that last action. Like using the "easy" auto-aim settings on a video game. This system will take the errors out of the aiming and engagement process and result in more accurate firings - when the operator actually pulls the trigger. Less collateral damage is likely. Fewer friendly fire situations should result as well. Going along these lines, the systems allow for much greater accuracy and control. As long as the trigger pull is only performed by the human operator.
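A minimal sketch of that division of labor -- machine handles acquisition and tracking, human alone pulls the trigger. Every name here (FiringSolution, propose_solution, the 0.9 confidence cutoff) is invented for illustration, not taken from any real fire-control system:

```python
# Hypothetical sketch of "automate everything up to the trigger pull":
# the machine proposes a firing solution, but nothing fires without an
# explicit operator confirmation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FiringSolution:
    target_id: str
    confidence: float  # tracker's confidence that the target is valid

def propose_solution(track_confidence: float, target_id: str,
                     min_confidence: float = 0.9) -> Optional[FiringSolution]:
    """Machine side: only surface a solution when tracking is solid."""
    if track_confidence < min_confidence:
        return None  # not confident enough to even ask the operator
    return FiringSolution(target_id, track_confidence)

def engage(solution: Optional[FiringSolution], operator_confirms: bool) -> bool:
    """Human side: the trigger pull is always the operator's decision."""
    return solution is not None and operator_confirms

# A low-confidence track never even reaches the operator:
assert propose_solution(0.5, "t1") is None
# Even a high-confidence solution fires only on human confirmation:
sol = propose_solution(0.99, "t2")
assert engage(sol, operator_confirms=False) is False
assert engage(sol, operator_confirms=True) is True
```

The design point is the one the comment makes: automation narrows aiming error, but the yes/no on lethal force stays with the human -- the machine can only ever propose, never dispose.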
(Score: 2) by HiThere on Tuesday March 12 2019, @04:22PM
I think you're assuming that automated tracking is working flawlessly, and the target doesn't hide behind someone else.
But you did say "less collateral damage" and "Fewer friendly fire situations", so perhaps you've got a point. I'd feel a lot happier about that point, though, if current history didn't include lots of remote fire control firing on schools and funerals.
Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.