posted by cmn32480 on Tuesday August 18 2015, @06:23AM   Printer-friendly
from the skynet-is-beginning dept.

Opposition to the creation of autonomous robot weapons has been a subject of discussion here recently. The New York Times has added another voice to the chorus with this article:

The specter of autonomous weapons may evoke images of killer robots, but most applications are likely to be decidedly more pedestrian. Indeed, while there are certainly risks involved, the potential benefits of artificial intelligence on the battlefield — to soldiers, civilians and global stability — are also significant.

The authors of the letter liken A.I.-based weapons to chemical and biological munitions, space-based nuclear missiles and blinding lasers. But this comparison doesn't stand up under scrutiny. However high-tech those systems are in design, in their application they are "dumb" — and, particularly in the case of chemical and biological weapons, impossible to control once deployed.

A.I.-based weapons, in contrast, offer the possibility of selectively sparing the lives of noncombatants, limiting their use to precise geographical boundaries or times, or ceasing operation upon command (or the lack of a command to continue).

Personally, I dislike the idea of using AI in weapons to make targeting decisions. I would hate to have to argue with a smart bomb to try to convince it that it should not carry out what it thinks is its mission because of an error.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Tuesday August 18 2015, @07:43AM (#224293) Homepage
    Street gangs, what's the harm? People, people are harmed. Yup, but most of the people who are harmed are in street gangs. Self-solving problem.

    If these things are going to malfunction, they'll malfunction most often where they are most often found, which is while the "good" guys are working on them. Self-solving problem.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 4, Insightful) by pkrasimirov (3358) Subscriber Badge on Tuesday August 18 2015, @08:26AM (#224305)

    Street gangs won't pull the trigger on a baby. There is less empathy in Africa, but still there is always some.

    Drones are killing machines. Literally. Targets are discriminated, true, but everybody is a target, just with different priority. If you are in sight you are on the list.

  • (Score: 0) by Anonymous Coward on Tuesday August 18 2015, @08:29AM (#224307)

    > Street gangs, what's the harm? People, people are harmed. Yup, but most of the people who are harmed are in street gangs. Self-solving problem.

    Which is why there are no street gangs any more, the problem totally solved itself.

    > If these things are going to malfunction,

    No serious objection is about malfunctions. The problem with these systems is that they will function exactly as designed.