
posted by cmn32480 on Monday October 31 2016, @06:16PM   Printer-friendly
from the explosions-killing-everybody-isn't-a-choice dept.

Researchers at MIT have put together a pictorial survey http://moralmachine.mit.edu/ -- if a self-driving car loses its brakes, should it go straight or turn? Various scenarios are presented in which either occupants or pedestrians die, and a variety of pedestrians appear in the road, from strollers to thieves, even pets.

This AC quickly began to develop my own simplistic criteria, and the decisions got easier the further I went in the survey.

While the survey is very much idealized, it may have just enough complexity to give some useful results.


Original Submission

  • (Score: 2) by urza9814 (3954) on Tuesday November 01 2016, @06:15PM (#421340) Journal

    > I will never be in or allow anyone whose life I am responsible for to be in any vehicle where a machine that does not value my/their life/lives above all others is making decisions about who lives and who dies. If it comes to a decision between mowing down a dozen gradeschool kids or running my nephews into a tree, there had better be dead school kids scattered as far as the eye can see.

    So a bunch of innocent kids need to die because of YOUR DECISION to buy an autonomous car and send your nephew to school in it?

    The car should always prioritize those who had zero control over the situation above those who voluntarily put themselves at risk. The problem is that the first car company to do otherwise forces all the rest to follow -- because I'm sure you aren't the only one who would never purchase a car that holds YOU responsible for your actions when it could shift that risk onto others.

    http://www.smbc-comics.com/comic/self-driving-car-ethics [smbc-comics.com]

  • (Score: 2) by The Mighty Buzzard (18) Subscriber Badge <themightybuzzard@proton.me> on Tuesday November 01 2016, @07:06PM (#421354) Homepage Journal

    You can make 'em do whatever you like. Those are simply my criteria for ever making use of one.

    And no, it's not immoral. Caring less for yourself and those closest to you than for people you've never met -- that's called altruism, and that most certainly is immoral.

    --
    My rights don't end where your fear begins.