
SoylentNews is people

posted by cmn32480 on Monday October 31 2016, @06:16PM   Printer-friendly
from the explosions-killing-everybody-isn't-a-choice dept.

Researchers at MIT have put together a pictorial survey http://moralmachine.mit.edu/ -- if the self-driving car loses its brakes, should it go straight or turn? Various scenarios are presented with either occupants or pedestrians dying, and there are a variety of peds in the road from strollers to thieves, even pets.

This AC found that he quickly began to develop simplistic criteria of his own, and the decisions got easier the further he went in the survey.

While the survey is very much idealized, it may still have just enough complexity to yield some useful results.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Monday October 31 2016, @09:47PM (#421064)

    This is pointless pontification, because essentially all of these scenarios assume omniscience. From the standpoint of a control system, the car is never going to be sure it's a pedestrian vs. a deer...

    I believe you are missing the point. The car will be judged on what it does with what it knows (believes) at the time. If it mistakes a deer for a human, a jury may hold the company less culpable than if it kills the driver to avoid a deer, having guessed correctly that it's a deer. (A jury would probably expect the passenger's life to get priority over a deer's.) Misidentification is typically given more leeway than bad choices based on what's believed to be true. (Assuming it's considered a reasonable mistake, such as one a human could also make.)
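    The "judged on what it believes" framing can be sketched as a tiny expected-harm rule: the controller scores each candidate action against its current (possibly wrong) classification and picks the least harmful. Everything below — the class labels, harm weights, and probabilities — is a hypothetical illustration for this thread, not any real autonomous-vehicle policy.

    ```python
    # Hypothetical sketch: pick the action minimizing expected harm
    # under the car's *believed* classification, which may be wrong.
    HARM = {  # illustrative harm weights, not real policy
        ("swerve", "deer"): 5.0,           # swerving risks the passenger either way
        ("swerve", "human"): 5.0,
        ("brake_straight", "deer"): 1.0,   # likely hits the deer
        ("brake_straight", "human"): 10.0, # likely hits a person
    }

    def choose_action(belief):
        """belief: dict mapping class label -> probability (sums to 1)."""
        actions = {a for a, _ in HARM}
        def expected_harm(action):
            return sum(p * HARM[(action, label)] for label, p in belief.items())
        return min(actions, key=expected_harm)

    # A confident "deer" belief keeps the car braking straight...
    print(choose_action({"deer": 0.9, "human": 0.1}))  # brake_straight
    # ...while a confident "human" belief flips the choice to swerving.
    print(choose_action({"deer": 0.1, "human": 0.9}))  # swerve
    ```

    The legal point above maps onto the `belief` input: a jury judges whether the probabilities were reasonable to hold, separately from whether the action chosen under those probabilities was reasonable.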

    Whether that's fair or good is another matter; it's how humans tend to judge, and it's what the car co will be subject to when disputes happen. The car co wants to protect its legal tail by making the INTENDED rules reflect what a jury would likely expect.

    It's somewhat comparable to Hillary getting off the hook because she didn't knowingly transmit classified info (i.e., no intent was proven). Her mental (mis)classification of the material made a difference from a legal standpoint.