John Markoff writes in the NYT on a new report by Paul Scharre, a former Pentagon official who helped establish United States policy on autonomous weapons. Scharre argues that autonomous weapons could be uncontrollable in real-world environments, where they are subject to design failure as well as hacking, spoofing and manipulation by adversaries. The report contrasts these completely automated systems, which can target and kill without human intervention, with weapons that keep humans "in the loop" in the process of selecting and engaging targets. "Anyone who has ever been frustrated with an automated telephone call support helpline, an alarm clock mistakenly set to 'p.m.' instead of 'a.m.,' or any of the countless frustrations that come with interacting with computers, has experienced the problem of 'brittleness' that plagues automated systems," Mr. Scharre writes.
The United States military does not have advanced autonomous weapons in its arsenal. However, this year the Defense Department requested almost $1 billion to manufacture Lockheed Martin's Long Range Anti-Ship Missile, which is described as a "semiautonomous" weapon. The missile is controversial because, although a human operator will initially select a target, it is designed to fly for several hundred miles while out of contact with the controller and then automatically identify and attack an enemy ship. As an alternative to completely autonomous weapons, the report advocates what it describes as "Centaur Warfighting." The term "centaur" has recently come to describe systems that tightly integrate humans and computers. Human-machine combat teaming takes a page from the field of "centaur chess," in which humans and machines play cooperatively on the same team. "Having a person in the loop is not enough," says Scharre. "They can't be just a cog in the loop. The human has to be actively engaged."
(Score: 3, Insightful) by c0lo on Tuesday March 01 2016, @01:00AM
Weapon = any tool powerful/dangerous enough to create damage (starting with a kitchen knife and ending with "No, Mr. Bond. I expect you to die!")
If you agree with the above, why should military weapons have to include a human in the loop, while self-driving cars shouldn't?
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 1) by khallow on Tuesday March 01 2016, @01:22AM
Weapon = any tool powerful/dangerous enough to create damage (starting with a kitchen knife and ending with "No, Mr. Bond. I expect you to die!")
Like elevators?
(Score: 4, Insightful) by Francis on Tuesday March 01 2016, @01:25AM
It's a ridiculous analogy to make.
Weapons are designed for the purpose of killing people. If anything goes wrong they'd go on a massacre. In a nightmare scenario they'd go Skynet on us and wipe out every last one of us.
If a car gets out of control, a few people might die. In an extreme case maybe a couple dozen, but that's it. More likely the car would just come to a stop without hurting anybody or damaging anything.
(Score: 2) by c0lo on Tuesday March 01 2016, @01:46AM
Who says the car analogies should be profound?
Anyway, let's see what we can derive from this.
Ah, I see... so:
1. humans cannot be trusted to drive a car, in which the worst that can happen is a dozen lives being lost, but...
2. ... when facing the risk of a massacre, we must put our trust in humans.
(my brain melting)
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 3, Insightful) by Francis on Tuesday March 01 2016, @02:37AM
A car is not a weapon in any sort of reasonable sense. They can be used to kill, but they don't maneuver well enough and are really only useful if the person you're trying to kill doesn't know you're trying to run them down. The more likely problem isn't malice, it's inattentiveness and making mistakes. Allowing AI to handle that makes a lot of sense for exactly those reasons.
As far as weapons go, having actual people engage in judgment and with a vested interest in the weapons not destroying all of humanity makes a lot of sense. They might engage in the occasional massacre in the worst cases, but those are small problems compared with the possible robot apocalypse if the AI controlling the weapons is allowed to make too many decisions on our behalf.
(Score: 2) by c0lo on Tuesday March 01 2016, @02:49AM
FTFY... because the road to hell is paved with good intentions.
Ok, let's reduce the extent of the problem: no longer is humanity at stake, but only a couple of thousand lives per incident - e.g. in the context of TFS, the wrong ship being hit from several hundred miles away.
For your convenience, here's the list again:
1. humans cannot be trusted to drive a car, in which the worst that can happen is a dozen lives being lost, but...
2. ... when facing the risk of about 2000 lives being lost we must put our trust in humans.
Does it make more sense?
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 0) by Anonymous Coward on Tuesday March 01 2016, @08:03AM
> A car is not intended to be a weapon in any sort of reasonable sense.
How about a car with a bomb in it? Or with a gamma source in it?
The nice thing about a self-driving car bomb is that nobody has to die when the bomb goes off.
(Score: 2) by GreatAuntAnesthesia on Tuesday March 01 2016, @02:38PM
I think the real issue is that an AI driving a car only has to detect whether that solid object there is likely to be in a collision with the car, and then attempt to avoid the collision. It doesn't much care whether said solid object is a human, a car, a cow or a fallen tree branch.
An AI missile, however, has to not only make the distinction between a boat and a whale, or a boat and an iceberg, or a boat and a big mass of flotsam: It also has to distinguish between a boat full of hostiles, a boat full of refugees, a boat full of friendly combatants, a boat full of hostages with a handful of enemies aboard and so on, and respond accordingly. This is a much more difficult problem, AI-wise. It is made even more difficult by the fact that in a theatre of war you can be damned sure that your enemies will be doing all they can to confuse and misdirect your AI, something that civilian cars shouldn't have to deal with nearly so often.
(Score: 2) by captain normal on Tuesday March 01 2016, @05:17AM
"A car is not a weapon in any sort of reasonable sense."
So why is someone who tries to run down a law enforcement officer charged with "assault with a deadly weapon"?
"It is easier to fool someone than it is to convince them that they have been fooled" Mark Twain
(Score: 2) by Unixnut on Tuesday March 01 2016, @12:34PM
> A car is not a weapon in any sort of reasonable sense.
What makes a weapon is intent. A car is a weapon, as is a hammer, or a rock, depending on the intent of the user.
Used as a weapon, a car is pretty decent (Even though not designed specifically for the purpose, as you mentioned). The kinetic energy it can reach alone can cause serious damage to a lot of things. High speed impacts into trains, buildings or other vehicles can leave quite a bit of devastation.
Not even considering that you can pack an autonomous van with explosives and program it to go off and detonate somewhere hundreds of miles from its start point. Yes, you can do it now with suicide bombers, but outside of one particular religion, there are not many people willing to do that. However if you don't need a driver to take the bomb to its destination, I suspect a lot more people would find it an attractive method of voicing their grievances in a forceful manner.
(Score: 1) by Francis on Wednesday March 02 2016, @04:17AM
If we extend the definition of weapon to include cars, the term "weapon" loses all meaning.
As you correctly note, it's the use that makes something a weapon. But, comparing a weaponized car with the kinds of military devices that are being used and built specifically as weapons is a bit ridiculous.
(Score: 2) by AnonTechie on Tuesday March 01 2016, @09:23AM
What about Paperclip maximizer [lesswrong.com] ?
Albert Einstein - "Only two things are infinite, the universe and human stupidity, and I'm not sure about the former."
(Score: 2) by Valkor on Tuesday March 01 2016, @01:05AM
We're all half-centaur.
(Score: 3, Insightful) by bob_super on Tuesday March 01 2016, @01:09AM
> it is designed to fly for several hundred miles while out of contact with the controller and then automatically identify and attack an enemy ship.
Contrast with current cruise missiles, which fly for hundreds of miles and automatically use a mix of inertial guidance, GPS and cameras to identify the right building and try to hit a wall opening.
Sorting out a capital ship from its escorts and the waves sounds much simpler...
(Score: 3, Interesting) by art guerrilla on Tuesday March 01 2016, @01:20AM
from the time when oog and pals raided boog's tribe and stole their flint and wenches, up until approx WW1, the military to civilian death ratio was approx 9 to 1...
from about WW1 on, it has flipped, where it is now 90% civilian deaths, 10% military...
um, is there anyone else who thinks this might be a bug, not a feature ? ? ?
(Score: 2, Interesting) by Francis on Tuesday March 01 2016, @01:27AM
There are a few things that happened there. Up until then militaries mostly met each other on fields for battles. And the weapons weren't that powerful. Try accidentally killing civilians with a sword some time and you'll see what I mean. Whereas a bullet can travel rather far if you miss your target.
Also, bombing raids and deliberately targeting civilians are relatively new occurrences. I'm sure they happened in the past, but not to the extent we've seen in the 20th and 21st centuries.
(Score: 3, Insightful) by bob_super on Tuesday March 01 2016, @01:52AM
Starving everyone within the walls was the point of castle sieges.
Flinging a few decaying bodies to cause diseases is commonly referred to as the first case of biological warfare.
In both cases, the civilians, amateur defense forces, are also a target.
Also, while fewer civilians died on the battlefield, traveling or occupying armies were always causing a lot of "collateral" damage, with or without weapons.
(Score: 4, Informative) by c0lo on Tuesday March 01 2016, @01:59AM
Ah, yes... raining rocks and fire from catapults over a fortified township full of civilians is indeed recent (at geological scales, perhaps).
Notable mention to the "Crush your enemies. See them driven before you. Hear the lamentations of their women."
Not for the lack of trying, no. It was only due to the limitations of their weapons.
It is the protection of civilians at the time of war that is recent:
Convention (IV) relative to the Protection of Civilian Persons in Time of War. Geneva, 12 August 1949. [icrc.org]
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 3, Insightful) by deadstick on Tuesday March 01 2016, @04:45AM
bombing raids and deliberately targeting civilians are relatively new occurrences.
Leave out the aerial bombing and read Joshua 6:21.
(Score: 0) by Anonymous Coward on Tuesday March 08 2016, @04:19AM
Genghis Khan
The movie 'Hero'
United State's takeover of western North America.
I'm sure there are dozens of other examples.
The civilian populace has been targeted since time immemorial. And if they are not outright butchered, their males often are, and their females are raped and usually impregnated to ensure subjugation is complete.
(Score: 0) by Anonymous Coward on Tuesday March 01 2016, @02:45AM
You know how I know you never read Thucydides?
(Score: 0) by Anonymous Coward on Tuesday March 01 2016, @06:44AM
In the past everyone who could throw a stone was military. I dunno what history you've been reading, but in the old days wiping out entire villages, towns and cities wasn't that uncommon. Keeping fertile women alive was common, and so was enslaving the able-bodied ones, but so was simply killing everyone.
Nowadays the military is the tip of the spear, and rest of the spear is the industrial and population base. Those ships, planes, tanks, and munitions aren't built by soldiers nor are the raw materials mined by soldiers. So of course you destroy the spear.
In the old days, before the agricultural or industrial revolution, killing everyone could be more beneficial. Back then one farmer couldn't feed many others, and there weren't decent birth control methods. When agricultural and industrial productivity was low, if you're going to take over the land and resources, eliminating the current inhabitants isn't such a bad move. But as technology and productivity improved, collecting tribute/taxes started becoming more and more attractive.
Then when productivity improved even more and "norms became more civilized", trade actually makes more and more sense and war less so. And you start hearing leaders say stuff like:
"Why, of course, the people don't want war," Goering shrugged. "Why would some poor slob on a farm want to risk his life in a war when the best that he can get out of it is to come back to his farm in one piece. Naturally, the common people don't want war; neither in Russia nor in England nor in America, nor for that matter in Germany. That is understood. But, after all, it is the leaders of the country who determine the policy and it is always a simple matter to drag the people along, whether it is a democracy or a fascist dictatorship or a Parliament or a Communist dictatorship."
In the ancient days, that poor slob on a farm was often willing to risk his life in a war because he had much to gain too: gold, slaves (including sex slaves), his own land (he might not have owned the farm he was working on).
Today that poor slob is risking his life just to make the Military Industrial Complex and their friends richer and more powerful. But of course they tell that poor slob he is defending his country/religion/family. And those poor slobs are usually stupid enough to believe it.
(Score: 1) by redneckmother on Tuesday March 01 2016, @02:52AM
"They can't be just a cog in the loop. The human has to be actively engaged."
And, a human MUST make the final decision for "go-nogo", else the "weapon" must NOT take out the presumed target.
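That go/no-go rule can be sketched as a default-deny gate: the system records an explicit human confirmation per target and refuses to engage anything that lacks one. This is a minimal illustrative sketch; the names (`Target`, `EngagementGate`) are hypothetical, not from any real weapon system.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Target:
    track_id: str

class EngagementGate:
    """Default-deny engagement gate: no recorded human 'go', no engagement."""

    def __init__(self):
        self._confirmed: set[str] = set()

    def human_confirm(self, target: Target) -> None:
        # Called only from the operator's console after a positive decision.
        self._confirmed.add(target.track_id)

    def may_engage(self, target: Target) -> bool:
        # Absence of a confirmation is a no-go, never a default yes.
        return target.track_id in self._confirmed

gate = EngagementGate()
t = Target("track-42")
assert not gate.may_engage(t)  # no human decision yet -> no-go
gate.human_confirm(t)
assert gate.may_engage(t)      # explicit go recorded -> weapon may engage
```

The point of the sketch is the default: the safe state (do not engage) requires no action, and only an affirmative human step can change it.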
Mas cerveza por favor.
(Score: 0) by Anonymous Coward on Tuesday March 01 2016, @02:59AM
You have a squadron of ten guys, each with a separate passkey. The AV can be controlled by one designated passkey (the operator/guide) but disabled by either the designated passkey or by 3 of the other 9, in case the AV falls into enemy hands. That kind of thing.
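The disable rule described above is a simple threshold scheme: the designated operator key alone suffices, or any 3 of the other 9 squad keys acting together. A minimal sketch, with hypothetical key names:

```python
DESIGNATED = "key-0"                          # the operator/guide's passkey
SQUAD = {f"key-{i}" for i in range(1, 10)}    # the other 9 squad members

def disable_authorized(presented: set[str]) -> bool:
    """True if the presented passkeys are sufficient to disable the AV."""
    if DESIGNATED in presented:
        return True  # operator key alone can always disable
    # Quorum path: at least 3 valid squad keys, e.g. if the operator is lost.
    return len(presented & SQUAD) >= 3

assert disable_authorized({"key-0"})                       # operator alone
assert not disable_authorized({"key-1", "key-2"})          # 2 of 9: not enough
assert disable_authorized({"key-1", "key-2", "key-3"})     # 3 of 9: quorum met
```

In a real system the "3 of 9" part would be done cryptographically (e.g. a threshold signature) rather than by counting strings, but the authorization logic is the same.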
(Score: 2) by VanderDecken on Tuesday March 01 2016, @04:00AM
Pfft. Wrong feedback loop for combat operations. If for no other reason than that, at any given time, 7 of your 9 with "passkeys" may already be dead. So much for your weapon system.
One person is the operator. For significant targets or weapons with greater destruction, you have at least one other person (sometimes an officer, depending on the circumstances) validating the target and the decision to engage, but it's a check protocol and not technologically enforced. Otherwise the operator Is It.
The feedback loop is simple. If the killing is justified, that's warfare. If it's unjustified, it's murder and we have military courts to deal with it. Take the human out of the loop, and you no longer have the feedback loop as a check.
For brevity, I'm not going into what constitutes justified vs unjustified, but anyone who has studied military law should have a good feeling for the difference.
I am a software architect and was once a soldier. I *want* humans in the loop. Fully autonomous weapon systems scare the shit out of me.
The two most common elements in the universe are hydrogen and stupidity.
(Score: 3, Funny) by Bogsnoticus on Tuesday March 01 2016, @03:11AM
Would the fitting insult to fling at your typical steroid-infused modern centaur be:
"You're hung like a 9mm"?
Genius by birth. Evil by choice.
(Score: 2) by Dunbal on Tuesday March 01 2016, @04:06AM
The thing about autonomous weapons is that you are going to want some sort of "off" switch. And the thing about that, see, is that if you can turn it off, I can figure out how to turn it off too.
(Score: 0) by Anonymous Coward on Tuesday March 01 2016, @08:06AM
Then I'll fit it with an "on" switch, and turn it on with that. :-P Phbbt!
(Score: 2) by JeanCroix on Thursday March 03 2016, @05:43PM