
posted by Fnord666 on Friday November 01 2019, @09:37AM   Printer-friendly
from the the-buck-stops-here dept.

Tonight, drivers in the US will kill more pedestrians than any other night of the year. An increase in people walking in low-light conditions makes Halloween the most dangerous night of the year for pedestrians.
[...]
In "bad intervention" scenarios, the main driver (either human or machine) makes a driving decision that would avoid hitting the pedestrian, but the secondary driver intervenes with the wrong call, resulting in a collision. In bad interventions, it makes sense that the secondary driver is really the one to blame, since they overrode the correct actions of the primary driver.

This expectation matches how people reacted. When participants saw this scenario and rated, on a scale of 1 to 100, how blameworthy each driver was and how much each caused the death, the secondary driver came out bearing most of the blame. This was true whether the secondary driver was a human or a machine.

In "missed intervention" scenarios, though, things looked a little different. In these scenarios, the main driver is the one who makes the wrong call, but the secondary driver doesn't intervene to rescue the situation. In these scenarios, both drivers made an error.

Participants did apportion some blame to both drivers in these scenarios—but the human took more blame than the car.

https://arstechnica.com/science/2019/10/humans-take-more-blame-than-cars-for-killing-pedestrians/
Nature Human Behaviour, 2019. DOI: 10.1038/s41562-019-0762-8 http://dx.doi.org/10.1038/s41562-019-0762-8


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 4, Insightful) by Immerman (3985) on Friday November 01 2019, @01:36PM (#914558)

    And humans *should* take more blame. *You* are operating the car - if you weren't involved, the car would be safely parked, so its operation is your responsibility. If you choose to hand that responsibility over to a machine... it's ultimately still your responsibility, because the machine *can't* take responsibility. (Especially if it's inadequate to the job - and I haven't yet heard of any auto-drive system that makes a credible claim to being adequate as the primary driver on residential streets.)

    Now, if the machine intervenes in a way that prevents you from doing the right thing - then yes, it's the machine's (which is to say the manufacturer's) fault: they sold it as safe to drive when it obviously wasn't. But if you then continue to drive the car, knowing it can't be driven safely, the moral responsibility is back on you. (Legal responsibility should probably remain with the manufacturer, barring a recall.)
