
posted by cmn32480 on Monday October 31 2016, @06:16PM
from the explosions-killing-everybody-isn't-a-choice dept.

Researchers at MIT have put together a pictorial survey http://moralmachine.mit.edu/ -- if a self-driving car loses its brakes, should it go straight or turn? Various scenarios are presented in which either occupants or pedestrians die, and the pedestrians in the road range from strollers to thieves; there are even pets.

This AC found that he quickly began to develop his own simplistic criteria, and the decisions got easier the further he went in the survey.

While the survey is very much idealized, it may have just enough complexity to give some useful results.
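The "simplistic criteria" the submitter mentions amount to a decision rule over each scenario's two outcomes. Below is a minimal sketch of one such rule in Python; the Outcome fields and the tie-breaker are invented for illustration and come from MIT's survey in no way.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    occupants_killed: int
    pedestrians_killed: int
    pedestrians_crossing_legally: bool  # were the victims obeying the signal?

def total_deaths(o: Outcome) -> int:
    return o.occupants_killed + o.pedestrians_killed

def choose(straight: Outcome, swerve: Outcome) -> str:
    """One 'simplistic criterion': minimize total deaths, then break
    ties by sparing pedestrians who were crossing legally."""
    if total_deaths(straight) != total_deaths(swerve):
        return "straight" if total_deaths(straight) < total_deaths(swerve) else "swerve"
    if straight.pedestrians_crossing_legally and not swerve.pedestrians_crossing_legally:
        return "swerve"
    return "straight"

# Example: going straight kills two jaywalkers; swerving kills one occupant.
print(choose(Outcome(0, 2, False), Outcome(1, 0, False)))  # -> swerve
```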


Original Submission

 
  • (Score: 5, Interesting) by bob_super (1357) on Monday October 31 2016, @06:39PM (#420968)

    If the self-driving car fails to diagnose impending brake failure, fails to operate redundant parking or engine braking (elec or gas), happens to be in a situation where coasting to a stop is impossible, happens to be by itself and not able to communicate with another automatic car to engage procedure 73.4 "the rescue rear-end", happens to not have any ditch, parked cars, urban furniture or trees to hit to get to a stop, is not equipped with instant tire deflators or a Turbo Boost button, AND happens to find on its path a soft target which does not have time to react to honking and flashing headlights, THEN the car should connect to the appropriate lottery agency and get a couple tickets to help pay for the lawyers who will have to figure out who's responsible in this clusterfuck of Hollywood-level coincidences...
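    Joke aside, the chain of conditions above is an ordered fallback cascade: try each mitigation in turn and escalate only when it fails. A toy sketch of that structure, in which every vehicle method is hypothetical (no real car exposes such an API):

    ```python
    # bob_super's chain of conditions, restated as an ordered fallback cascade.
    FALLBACKS = [
        ("parking brake",       lambda car: car.apply_parking_brake()),
        ("engine braking",      lambda car: car.engine_brake()),
        ("coast to a stop",     lambda car: car.coast_to_stop()),
        ("V2V rescue rear-end", lambda car: car.request_rescue_rear_end()),
        ("controlled crash",    lambda car: car.steer_into_soft_obstacle()),
        ("tire deflators",      lambda car: car.deflate_tires()),
    ]

    def handle_brake_failure(car) -> str:
        """Try each mitigation in order; each returns True on success."""
        car.honk_and_flash()  # warn the "soft target" no matter what
        for name, attempt in FALLBACKS:
            if attempt(car):
                return name
        return "buy lottery tickets"  # the comment's last resort

    class Car:
        """Stub whose mitigations all fail, matching the comment's scenario."""
        def honk_and_flash(self): pass
        def __getattr__(self, name):      # every other mitigation "fails"
            return lambda: False

    print(handle_brake_failure(Car()))   # -> buy lottery tickets
    ```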

  • (Score: 5, Interesting) by AthanasiusKircher (5291) on Monday October 31 2016, @08:58PM (#421045) Journal

    THEN the car should connect to the appropriate lottery agency and get a couple tickets to help pay for the lawyers who will have to figure out who's responsible in this clusterfuck of Hollywood-level coincidences...

    Who could be responsible in your scenario? Uhh, the pedestrian??

    Seriously, pedestrians are frequently found where they "shouldn't be." That's a factor in many (perhaps most) pedestrian fatalities in accidents. Heck, the NHTSA reports [dot.gov] that in roughly 1/3 of pedestrian fatalities, the pedestrian was drunk. And that's just for the 4000-5000 pedestrians who are actually killed in crashes each year. The CDC estimates [cdc.gov] more than 150,000 emergency room visits for non-fatal injuries to pedestrians each year.

    Even if you can blame a significant number of those on drunk drivers, distracted drivers, poor drivers, etc., there are also going to be thousands of cases each year where the pedestrian was basically in an unexpected or dangerous place and got hit. The kid ran out into the street after the ball. The old man couldn't walk fast enough and was wearing a dark-colored coat. Etc.
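    Spelling out the arithmetic implicit in those figures (the NHTSA and CDC numbers as quoted above; the 4500 midpoint and the rounding are assumptions):

    ```python
    fatalities_per_year = 4500        # midpoint of "4000-5000 pedestrians ... killed each year"
    drunk_fraction      = 1 / 3       # "roughly 1/3 ... the pedestrian was drunk"
    er_visits_per_year  = 150_000     # CDC estimate of non-fatal pedestrian injuries

    print(round(fatalities_per_year * drunk_fraction))      # ~1500 drunk-pedestrian deaths/yr
    print(round(er_visits_per_year / fatalities_per_year))  # ~33 ER visits per fatality
    ```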

    Thus, assuming self-driving cars become popular, they WILL hit pedestrians sometimes, unless we manage to ban all pedestrians from all roads at all times (which seems unlikely).

    I agree with you that the VAST MAJORITY of these are probably not "trolley problem" circumstances, because in most cases where drivers can reasonably swerve out of the way and avoid the pedestrian, they probably do. They don't need to worry about whether they are going to hit an old lady in the next lane instead of the drunken idiot in front of them, because those choices rarely occur in the real world. And in many other cases, there simply isn't time to maneuver to avoid the pedestrian, so the "trolley problem" scenarios can't apply.

    Everyone tends to worry about these sorts of liability scenarios -- should I hit the flock of schoolkids or swerve to plow down the old man? But I think this overlooks the much more obvious liability questions which are FAR more likely to end up in court or at least with media attention, namely -- "Could the self-driving car have done SOMETHING else?" With human drivers, we accept a certain level of error. There are still lawsuits, but that's why drivers are so rarely charged with hitting a pedestrian -- in a lot of cases the collision was unavoidable based on what the pedestrian was doing.

    But I predict we won't be as forgiving of "evil robot" cars. All it will take is one tragedy where a couple kids get run over and you have the passenger in the car saying, "God, I'm so sorry! If only I were driving, I'm sure I could have swerved!" It doesn't matter if no human could reasonably have prevented it. Suddenly we'll be treated with headlines of "Evil Robot Car Kills Kids!" with a social media drumbeat to ban them and Senators calling for inquiries and regulation, etc., etc.

    THAT is the scenario the self-driving car companies should be worried about. Not some bizarre contrived ethics problem debated by philosophy graduate students. For those of you who are happy to say, "Aren't the cars ready if they're just better than the average human driver?" The answer is NO. Because when a tragedy strikes -- and it WILL if these cars become common at all -- the lawyers and the media and Congressional inquiry committee won't be comparing the car to the "average human driver." That's just the fact of living in a litigious culture with a sensationalist media.

    • (Score: 1) by khallow (3766) Subscriber Badge on Monday October 31 2016, @10:56PM (#421077) Journal

      But I predict we won't be as forgiving of "evil robot" cars. All it will take is one tragedy where a couple kids get run over and you have the passenger in the car saying, "God, I'm so sorry! If only I were driving, I'm sure I could have swerved!" It doesn't matter if no human could reasonably have prevented it. Suddenly we'll be treated with headlines of "Evil Robot Car Kills Kids!" with a social media drumbeat to ban them and Senators calling for inquiries and regulation, etc., etc.

      Well put. I do think that after a while, the evil robot car syndrome will go away as people get used to the technology and such accidents will fade into the background noise of death that we accept. But until then, I think you'll be right about this scenario happening multiple times.

    • (Score: 2) by maxwell demon (1608) on Tuesday November 01 2016, @05:59AM (#421148) Journal

      Who could be responsible in your scenario?

      Let's analyze it:

      If the self-driving car fails to diagnose impending brake failure,
      If the impending brake failure would have been diagnosable, then it might be the manufacturer's fault for not properly implementing the check. If the brake failure is due to lack of maintenance, it might also be the owner's fault for neglecting proper maintenance.

      fails to operate redundant parking or engine braking (elec or gas),
      Again, the manufacturer would be responsible, for not programming in the use of that option.

      happens to be in a situation where coasting to a stop is impossible,
      For that, there might be no one responsible; of course, that cannot be decided without knowing the details of the situation.

      happens to be by itself and not able to communicate with another automatic car to engage procedure 73.4 "the rescue rear-end", happens to not have any ditch, parked cars, urban furniture or trees to hit to get to a stop,
      Nobody's failure in that, obviously.

      is not equipped with instant tire deflators or a Turbo Boost button,
      Might be considered the regulator's fault for not requiring this.

      AND happens to find on its path a soft target which does not have time to react to honking and flashing headlights,
      Again, without knowing the situation it's hard to tell, but most likely it's the fault of either the car manufacturer (if that situation would have been avoidable by more appropriate driving procedures — e.g. by going at an appropriate speed), or the soft target (suddenly entering the street close in front of the car).

      Note also that, depending on the exact situation, the legal framework, and the extent to which the driver is able to override the car's decisions, anything that is a bad decision by the car might at the same time be a bad decision by the driver, who didn't override the car when it obviously made the wrong decision.

      Anyway, there are enough "depends on the situation" answers here that a lawsuit is unavoidable.
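      The analysis above condenses to a mapping from failure mode to the parties plausibly responsible. A sketch of that table (the categories are from the post; the labels and the table form are illustrative):

      ```python
      LIABILITY = {
          "undiagnosed brake failure":       {"manufacturer", "owner (maintenance)"},
          "parking/engine braking not used": {"manufacturer"},
          "coasting to a stop impossible":   {"depends on situation"},
          "no rescue car, nothing to hit":   {"nobody"},
          "no tire deflators / turbo boost": {"regulator"},
          "soft target on the path":         {"manufacturer (speed)", "pedestrian"},
          "driver failed to override":       {"driver"},
      }

      def plausible_defendants(failure_modes):
          """Union of everyone a lawsuit could plausibly name."""
          return set().union(*(LIABILITY[m] for m in failure_modes))

      # The scenario requires ALL of these failures at once -- which is
      # exactly why "a lawsuit is unavoidable":
      print(plausible_defendants(LIABILITY.keys()))
      ```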

      --
      The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2) by darnkitten (1912) on Tuesday November 01 2016, @05:45PM (#421328)

      Seriously, pedestrians are frequently found where they "shouldn't be." That's a factor in many (perhaps most) pedestrian fatalities in accidents. Heck, the NHTSA reports that in roughly 1/3 of pedestrian fatalities, the pedestrian was drunk. And that's just for the 4000-5000 pedestrians who are actually killed in crashes each year. The CDC estimates more than 150,000 emergency room visits for non-fatal injuries to pedestrians each year.

      This podcast on the history of pedestrians vs. automobiles [99percentinvisible.org] got me thinking about assumptions I had made.