The technology is new, but the moral conundrum isn't: A self-driving car identifies a group of children running into the road. There is no time to stop. To swerve around them would drive the car into a speeding truck on one side or over a cliff on the other, bringing certain death to anybody inside.
To anyone pushing for a future of autonomous cars, this question has become the elephant in the room, argued over incessantly by lawyers, regulators, and ethicists; it has even been the subject of a study published in Science. Content to keep their names in the background of this life-or-death drama, most carmakers have let Google take the lead while making passing reference to ongoing research, investigations, or discussions.
But not Mercedes-Benz. Not anymore.
The world's oldest carmaker no longer sees the problem, a variant of the ethical dilemma posed in 1967 and known as the Trolley Problem, as unanswerable. Rather than tying itself into moral and ethical knots in a crisis, Mercedes-Benz simply intends to program its self-driving cars to save the people inside the car. Every time.
Is it really a decision based on morality, or is it because choosing to save the pedestrians is much harder to code?
(Score: 0) by Anonymous Coward on Wednesday October 12 2016, @12:45PM
There's a difference between "break" and "brake". You might want to learn that difference, because your post doesn't say what you probably meant it to say.
(Score: 2) by hendrikboom on Thursday October 13 2016, @12:56PM
Yes, if you don't brake your car may break.