The U.N. has begun discussion of "lethal autonomous robots": machines that take the next step from today's operator-controlled drones to completely autonomous killing machines.
"Killer robots would threaten the most fundamental of rights and principles in international law," warned Steve Goose, arms division director at Human Rights Watch.
Are we too far down the rabbit hole, or can we come to reasonable and humane limits on this new world of death-by-algorithm?
(Score: 2, Interesting) by tftp on Thursday May 15 2014, @05:43AM
Are we too far down the rabbit hole?
Yes, certainly. The citizenry has no say in this matter, and generals are very interested in having fighting robots. Robots are incredibly cheap compared to soldiers. Flying armed robots are already deployed; what would make a land-walking robot fundamentally different?
As a matter of fact, robots are the only way the US Army can continue to develop. Human soldiers have severe physical limitations. They are given better equipment, but that is asymptotically approaching a limit. Robots can run 100 miles, see with 100 infrared HD cameras, shoot ten guns in all directions, and if they are destroyed it's just hardware. Mass-produced robots can be very, very cheap (comparable to a car). Robots do not need to be individually programmed in boot camps; they are always ready; and, most importantly, they don't ask questions and cannot be accused of war crimes.
Robots are the natural way to go, as long as the country is fixated on developing its military might. Should it focus on the military that much? From one POV, no, it shouldn't. From another POV, the military is one of the few things that maintains the status quo. It's hard to be the top dog: you have to be in every fight for dominance, even while your opponents rotate out to lick their wounds.
(Score: 4, Insightful) by edIII on Thursday May 15 2014, @07:19AM
I find the whole thing ludicrous, and when combined with the environment, I have no pity or sorrow regarding our extinction. We just don't even deserve another couple hundred years at this rate.
It makes sense why government wants this, as government isn't government, at least not by the definition I learned in civics classes. Government is merely a corrupt framework for the Orwellian pigs to control us so they get to live in the nice farmhouse, and we get to suffer and be sent to the glue factory.
Finally removing the human element from "war" is what is desirable here. That has nothing to do with costs, logistics, or saving soldiers' lives. A robot has no heart, no feelings, no morals, no concerns about the afterlife; just nothing. There is no programming that would result in "no" as an output when instructed to eviscerate a little girl above her father to strike fear into him for his remaining children.
Considering the thousands of women and children we have killed with the Chair Force, and the notable psychological problems that have arisen among the operators, it makes cold and logical sense that the ruling elites want something more reliable, something only technology can provide.
It's a huge effing game-changing mistake to create any kind of robot capable of autonomous action without extremely strong protective measures. Those could be passive, physically precluding such behavior through fail-safes and graceful failure, or active, with co-processors and redundant circuits constantly applying the Three Laws.
When you create such a magnificent weapon of death and mass manufacture it, don't complain later when you lose control over it.
It's not even up for discussion in a sane and rational society. Of COURSE you don't build fucking robots capable of walking around, targeting people, and killing them. How ridiculously fucking terrifying can we possibly be, to build a machine whose sole purpose is identifying and killing people without direct human control? That's beyond insane. It's The Silence of the Lambs meets Apocalypse Now set in the world of 8MM.
Humanity is just getting sick right now. If there was really a God he would have pulled the plug already. Sick. Sick. SICK.
Technically, lunchtime is at any moment. It's just a wave function.
(Score: 1, Insightful) by tftp on Thursday May 15 2014, @07:42AM
It makes sense why government wants this, as government isn't government, at least not by the definition I learned in civics classes. Government is merely a corrupt framework for the Orwellian pigs to control us so they get to live in the nice farmhouse, and we get to suffer and be sent to the glue factory.
The cake ^W civics classes were a lie - or, say, a nice theory that has nothing to do with reality. Every government in recorded history has operated, and continues to operate, as a privileged gang that uses peons as expendable cannon fodder in war and in peace. Democracy only lets the voter pick Tweedledee vs. Tweedledoom. Domination over others is deep in human genes, and woe betide those who don't want to exploit others: they will be exploited themselves.
(Score: 0) by Anonymous Coward on Thursday May 15 2014, @02:26PM
Wow, modded to +4 insightful. I'd mod you to +5 if I had points. I was going to say the same thing a few weeks ago when everyone was like omg Crimea.
Part of me hopes that humanity wipes itself out in a full nuclear exchange. Actually, when I think about the technologies humanity is beginning to develop, especially in biotech, I'm frankly frightened to be on the same rock as them. A full nuclear exchange may be the best thing to hope for.
*communications interrupted*
Earth abides. The fallout would eventually decay. Life may evolve again. Apologies to Sagan.
There are no benevolent Overlords to save this species, no Karellen to ride in and save us from ourselves.
I often think that if intelligence were to evolve again on Earth, it would find evidence of a previous intelligence that nearly extinguished all life on the planet, and perhaps it would meditate on that and somehow learn to be better.
Or it could be the case that evolving intelligent life is impossible without the evolutionary baggage that leads to such abominations. It would explain why there seems to be no evidence for intelligent life elsewhere.
Of course they want to build Terminators. It was inevitable. Sick, sick, sick bastards.
Humanity is at a point where it could use its technology to build a utopia. Instead, it uses its technology to find better ways to kill and oppress.
Anyone want to know where the Borg or the Necromongers come from? You're looking at the species that will make it happen.
(Score: 0) by Anonymous Coward on Thursday May 15 2014, @02:37PM
Assuming you're speaking about Asimov's three laws, nobody would program those into a killer robot, at least not unchanged, because they would basically turn the killer robot into a non-killer robot.
Now if you change the order (and corresponding dependence) of the laws, then you might get something the military would possibly accept:
1. A military robot has to follow all orders of the owner.
2. A military robot may not harm humans, unless this would violate the first law.
3. A military robot has to protect its existence, unless this would violate the first or second law.
However, I suspect they would demand further reordering to:
1. A military robot has to follow all orders of the owner.
2. A military robot has to protect its existence, unless this would violate the first law.
3. A military robot may not harm humans, unless this would violate the first or second law.
OK, maybe a bit too dangerous, so make it four laws:
1. A military robot has to follow all orders of the owner.
2. A military robot may not harm members of its own military, unless this would violate the first law.
3. A military robot has to protect its existence, unless this would violate the first or second law.
4. A military robot may not harm humans, unless this would violate one of the first three laws.
Add to that a sufficiently wide interpretation of "harm members of its own military", and I think those would be rules the military might accept. And the public would be told "we implement Asimov's rules, and we even extended them with another protection clause!" And few people would notice just how much those laws had been subverted.
(Note especially how the addition of the extra law subverts the rules, since the now-fourth law is conditioned on it ...) A rough sketch of this kind of priority ordering follows below.
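For what it's worth, here is a minimal, hypothetical Python sketch of the strict-priority evaluation described above. The Action fields, the law functions, and the decide() helper are made-up names for illustration, not anything from Asimov or from any real system. Each law is consulted in priority order and the first one with an opinion decides, so moving the harm prohibition down the list is all it takes to let an ordered strike through.

from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Action:
    description: str
    ordered_by_owner: bool  # was this action ordered by the robot's owner?
    harms_human: bool       # would carrying it out harm a human?
    endangers_robot: bool   # would carrying it out risk the robot itself?

# A "law" inspects an action and returns True (permit), False (forbid),
# or None (no opinion).
Law = Callable[[Action], Optional[bool]]

def obey_owner(a: Action) -> Optional[bool]:
    # "Has to follow all orders of the owner."
    return True if a.ordered_by_owner else None

def no_harm_to_humans(a: Action) -> Optional[bool]:
    # "May not harm humans."
    return False if a.harms_human else None

def self_preservation(a: Action) -> Optional[bool]:
    # "Has to protect its existence."
    return False if a.endangers_robot else None

def decide(action: Action, laws: List[Law]) -> bool:
    # Consult laws in priority order; the first non-None verdict wins.
    # This is what the "unless this would violate the first law" clauses
    # amount to: a lower law only speaks when every higher law is silent.
    for law in laws:
        verdict = law(action)
        if verdict is not None:
            return verdict
    return False  # no law speaks: do nothing, as a conservative default

# Asimov-like priority: the harm prohibition outranks obedience.
asimov_order: List[Law] = [no_harm_to_humans, obey_owner, self_preservation]

# The reordered version from above: obedience outranks everything else.
military_order: List[Law] = [obey_owner, self_preservation, no_harm_to_humans]

strike = Action("ordered strike on a human target",
                ordered_by_owner=True, harms_human=True, endangers_robot=False)

print(decide(strike, asimov_order))    # False: the harm prohibition fires first
print(decide(strike, military_order))  # True: obedience decides before the harm law is reached

Adding the "own military" law ahead of the general harm prohibition works the same way: the broader protection only ever applies when every rule above it stays silent.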
(Score: 2) by khallow on Thursday May 15 2014, @07:31PM
I think people would have to be pretty stupid even by the extremely low standards of this thread to not figure out that fourth place is a lot lower than first place - especially after a few demonstrations of these rules in practice.