Researchers at MIT have put together a pictorial survey http://moralmachine.mit.edu/ -- if a self-driving car loses its brakes, should it go straight or swerve? Various scenarios are presented in which either occupants or pedestrians die, and there is a variety of pedestrians in the road, from strollers to thieves, even pets.
This AC found that I quickly began to develop my own simplistic criteria and the decisions got easier the further I went in the survey.
While the survey is very much idealized, it may have just enough complexity to yield some useful results.
(Score: 2) by urza9814 on Tuesday November 01 2016, @06:15PM
So a bunch of innocent kids need to die because of YOUR DECISION to buy an autonomous car and send your nephew to school in it?
The car should always prioritize those who had zero control over the situation above those who voluntarily put themselves at risk. The problem is that the first car company to do otherwise forces all the rest to follow -- because I'm sure you aren't the only one who would never purchase a car that holds YOU responsible for your actions when it could shift that risk onto others.
http://www.smbc-comics.com/comic/self-driving-car-ethics [smbc-comics.com]
(Score: 2) by The Mighty Buzzard on Tuesday November 01 2016, @07:06PM
You make em do whatever you like. Those are simply my criteria for ever making use of one.
And no, it's not immoral. You simply care less for yourself and those closest to you than you do for people you've never met. It's called altruism, and it most certainly is immoral.
My rights don't end where your fear begins.