

posted by janrinok on Wednesday October 12 2016, @05:43AM   Printer-friendly
from the no-more-heroes dept.

The technology is new, but the moral conundrum isn't: A self-driving car identifies a group of children running into the road. There is no time to stop. To swerve around them would drive the car into a speeding truck on one side or over a cliff on the other, bringing certain death to anybody inside.

To anyone pushing for a future of autonomous cars, this question has become the elephant in the room, argued over incessantly by lawyers, regulators, and ethicists; it has even been at the center of a study of human subjects published in Science. Happy to have their names kept in the background of the life-or-death drama, most carmakers have let Google take the lead while making passing reference to ongoing research, investigations, or discussions.

But not Mercedes-Benz. Not anymore.

The world's oldest carmaker no longer sees the problem, a variant of the 1967 thought experiment known as the Trolley Problem, as unanswerable. Rather than tying itself into moral and ethical knots in a crisis, Mercedes-Benz simply intends to program its self-driving cars to save the people inside the car. Every time.
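Stated as an algorithm, an occupant-first policy is just a fixed priority ordering over candidate maneuvers. A minimal sketch in Python, with made-up maneuvers and survival estimates (this is an illustration of the stated policy, not Mercedes' actual software):

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    occupant_survival: float   # estimated probability the occupants survive
    bystander_survival: float  # estimated probability bystanders survive

def choose_maneuver(options):
    """Occupant-first rule: pick the maneuver that maximizes the occupants'
    survival probability; bystander survival only breaks ties.
    (Hypothetical illustration, not production code.)"""
    return max(options, key=lambda m: (m.occupant_survival, m.bystander_survival))

options = [
    Maneuver("brake straight", occupant_survival=0.95, bystander_survival=0.10),
    Maneuver("swerve toward truck", occupant_survival=0.20, bystander_survival=0.99),
    Maneuver("swerve toward cliff", occupant_survival=0.05, bystander_survival=0.99),
]
print(choose_maneuver(options).name)  # "brake straight"
```

Note that the rule never trades occupant lives against bystander lives; bystander survival enters only as a tiebreaker, which is exactly the kind of hard-coded prioritization the comments debate.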

Is it really a decision based on morality, or because choosing to save the pedestrians is much harder to code?


Original Submission

  • (Score: 5, Interesting) by cccc828 on Wednesday October 12 2016, @07:07AM

    by cccc828 (2368) on Wednesday October 12 2016, @07:07AM (#413325)

The problem is not that you cannot solve the trolley problem (you can weigh lives in numerous ways); the problem, at least in Germany, is that you are not allowed to weigh one life against another. Not even one versus a million. See also the Luftsicherheitsgesetz [wikipedia.org].

The moment an engineer programs a car to prioritize one life over another, she/he is potentially liable for the harm caused. That is, by the way, also how it is handled (in Germany) for humans: if you face the trolley problem (however you move your car, you will hit someone), you broke the law and are responsible for the other person's injuries or death. However, the law recognizes that you were in an "emergency situation" and thus you are "excused" and there is no punishment.

The problem the engineers now face is that they are not in such an "emergency situation". They are not under stress, they do not face a risk to their own lives, etc. Thus, no traditional reason for an "excuse" applies. On some legal views they are thus liable.

This issue is hotly debated right now; there is currently no legal consensus. It is also unclear whether a pedestrian is "innocent" or whether she/he "agrees" to potential harm by being on the streets. Another thought experiment: a car can avoid a fatal crash by moving off the road and going through a hedge. The car does not know what is on the other side of that hedge. There is some probability that there are kids playing behind the hedge, or a person just relaxing in the sun, etc. Is it OK to (potentially) kill someone who does not even participate in the traffic? Where do you draw the line?
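The hedge scenario can be framed as decision-making under uncertainty: each maneuver's expected harm is the probability that someone is in its path times the harm inflicted if they are. A toy sketch with invented numbers:

```python
def expected_harm(p_person_present, harm_if_present):
    """Expected harm of a maneuver: probability a person is in the
    path, times the harm inflicted if so. (Toy model, made-up scale.)"""
    return p_person_present * harm_if_present

# Stay on the road: a collision with another road user is certain,
# but likely survivable.
on_road = expected_harm(1.0, 0.6)

# Go through the hedge: a low but unknown chance someone is behind it,
# but a hit there would likely be fatal.
through_hedge = expected_harm(0.05, 1.0)

print(on_road, through_hedge)
```

The legal objection described above is precisely that writing down this multiplication at design time already amounts to weighing one life against another.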

    Let's see who is faster: the lawmakers or the courts after an accident...

  • (Score: 0) by Anonymous Coward on Wednesday October 12 2016, @08:32AM

    by Anonymous Coward on Wednesday October 12 2016, @08:32AM (#413346)

Thank you for the citation. It is interesting and relevant, as Mercedes-Benz is based in Germany. It is also insane. We must allow many innocent people to die in order to protect the lives of other people who are also going to die. I guess this is what happens when you enshrine a concept as abstract, vague, and subjective as dignity as the most important principle of the state. I am glad I do not live in Germany.

    I guess the cars (or at least the software) will be available for export only, to places where actual lives are valued above philosophical debate.

    • (Score: 2, Informative) by cccc828 on Wednesday October 12 2016, @08:52AM

      by cccc828 (2368) on Wednesday October 12 2016, @08:52AM (#413349)

      As a foreigner living in Germany I think it is not the worst of places to live.

You have to understand the German laws (especially the fundamental ones) in light of the horrors of the NSDAP regime. Back then the needs of the many ("Aryans") outweighed the needs of the few ("Untermenschen", roughly "sub-humans"). The guiding principle in Germany is to never again repeat the mistakes from that dark chapter of German history.

That is also why the court reached that verdict. How do you assign a value, utility, or probability of death to a life without laying the foundations for the horrors of 1933? It makes sense to rule that way and pass the buck to the parliament and the public to come to a conclusion and write laws to reflect that conclusion.

      • (Score: 0) by Anonymous Coward on Wednesday October 12 2016, @09:28AM

        by Anonymous Coward on Wednesday October 12 2016, @09:28AM (#413365)

        While I appreciate the Germans' need to be conservative about anything that might lead to repeats of the Nazi regime, the reality is that you must, when in a position of policymaking or safety engineering, always weigh lives. Whether against other lives, political costs, financial costs, or whatever, it must and will happen. When death is your opponent, you cannot refuse to play the game.

        • (Score: 2, Interesting) by cccc828 on Wednesday October 12 2016, @11:11AM

          by cccc828 (2368) on Wednesday October 12 2016, @11:11AM (#413391)

I agree. There is no question that these problems must be solved for autonomous systems; the discussion currently is how best to solve them. You could, for example, treat them like vaccines (they might cause side effects and complications), power plants (their exhaust kills or shortens the lives of random people every year), pets, children, or even look to ancient slavery laws for inspiration (I am not kidding). These options make for lively discussions amongst legal scholars :)

  • (Score: 2) by JeanCroix on Wednesday October 12 2016, @03:53PM

    by JeanCroix (573) on Wednesday October 12 2016, @03:53PM (#413491)

    ...Luftsicherheitsgesetz.

    Gesundheit.