Researchers at MIT have put together a pictorial survey at http://moralmachine.mit.edu/ -- if a self-driving car loses its brakes, should it go straight or turn? Various scenarios are presented in which either occupants or pedestrians die, and there is a variety of pedestrians in the road, from strollers to thieves, even pets.
This AC quickly began to develop my own simplistic criteria, and the decisions got easier the further I went in the survey.
While the survey is very much idealized, it may have just enough complexity to give some useful results?
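The "simplistic criteria" the AC describes could be sketched as a crude priority rule. Everything here is an illustrative assumption -- the categories, the ranking, and the cost metric are hypothetical and have nothing to do with how the Moral Machine itself scores responses:

```python
# Hypothetical sketch of the kind of simplistic criteria a survey-taker
# might converge on. Categories and ordering are illustrative assumptions,
# not rules from the Moral Machine.

# Earlier in the list = spared first when the car must choose whom to endanger.
PRIORITY = ["child", "adult", "elderly", "pet"]

def choose_path(straight_group, swerve_group):
    """Return 'straight' or 'swerve', endangering the lower-priority group."""
    def cost(group):
        # Higher-priority individuals contribute more to the cost of hitting a group.
        return sum(len(PRIORITY) - PRIORITY.index(p) for p in group)
    # Endanger whichever group "costs" less to hit under this crude metric.
    return "swerve" if cost(straight_group) > cost(swerve_group) else "straight"

choose_path(["child", "adult"], ["pet"])  # → "swerve" (endanger the pet)
```

Even a toy rule like this makes the AC's point: once you commit to a ranking, each new scenario reduces to a mechanical comparison, which is why the decisions feel easier as the survey goes on.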
(Score: 0) by Anonymous Coward on Monday October 31 2016, @09:47PM
I believe you are missing the point. The car will be judged on what it does with what it knows (or believes) at the time. If it mistakes a deer for a human, a jury may hold the company less culpable than if it kills the driver to avoid a deer it correctly guessed was a deer. (A jury would probably expect the passenger's life to get priority over a deer's.) Misidentification is typically given more leeway than a bad choice based on what's believed to be true. (Assuming it's considered a reasonable mistake, such as one a human could also make.)
Whether that's fair or good is another matter; it's how humans tend to judge, and it's what the car company will be subject to when disputes happen. The car company wants to cover its legal tail by making the INTENDED rules reflect what a jury would likely expect.
It's somewhat comparable to Hillary getting off the hook because she didn't knowingly transmit classified info (i.e., no intent was proven). Her mental (mis)classification of the material made a difference from a legal standpoint.