posted by takyon on Thursday March 22 2018, @12:34PM
from the carmagedon dept.

A few Soylentils wrote in to tell us about a fatal collision between an autonomous Uber vehicle and a pedestrian.

Update - Video Released of Fatal Uber - Pedestrian Accident

I debated just replying to the original story, but this seemed a pretty significant update to me:

The Uber vehicle was operating in autonomous mode when it crashed into 49-year-old Elaine Herzberg on Sunday evening. Herzberg was transported to a hospital, where she later died from her injuries, in what may be the first known pedestrian fatality in a self-driving crash.

The video footage does not conclusively show who is at fault. Tempe police initially reported that Herzberg appeared suddenly; however, the video footage seems to show her coming into view a number of seconds before the crash. It also showed the vehicle operator behind the wheel intermittently looking down while the car was driving itself.

The link shows video of the seconds just before the accident.

The pedestrian did not step out in front of the vehicle; she was essentially already out in the middle of the road, and her lateral movement was nearly irrelevant. She might as well have been a stationary object in the middle of the road. You can see the headlights bring her feet into view first (meaning she was already in the lane before the headlights reached her) and then move up her body; she is already in the middle of the road, directly in front of the car, when she comes into view.

If I were driving that car, I think I'd have had time to hit the brakes (but not stop in time). I also think that if the camera view is an accurate representation of what was really visible, then the car was overdriving its headlights. Although, given my experience with cameras, I wouldn't be surprised if actual visibility was better than what the video shows.

This, in my opinion, is pretty damning.

Police Chief: Uber Self-Driving Car "Likely" Not At Fault In Fatal Crash

The chief of the Tempe Police has told the San Francisco Chronicle that Uber is likely not responsible for the Sunday evening crash that killed 49-year-old pedestrian Elaine Herzberg. “I suspect preliminarily it appears that the Uber would likely not be at fault in this accident,” said chief Sylvia Moir.

Herzberg was "pushing a bicycle laden with plastic shopping bags," according to the Chronicle's Carolyn Said, when she "abruptly walked from a center median into a lane of traffic."

After viewing video captured by the Uber vehicle, Moir concluded that “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." Moir added that "it is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available."

Self-Driving Car Testing Likely to Continue Unobstructed

Self-Driving Cars Keep Rolling Despite Uber Crash

The death of a woman who was struck by a self-driving Uber in Arizona on Sunday has auto-safety advocates demanding that U.S. regulators and lawmakers slow down the rush to bring autonomous vehicles to the nation's roadways. Don't count on it.

Efforts to streamline regulations to accommodate the emerging technology have been under way since the Obama administration with strong bipartisan support. And the Trump administration's aversion to restrictions and regulations makes it even more unlikely that the accident in Tempe, Arizona, in which an autonomous Uber sport utility vehicle struck and killed a pedestrian, will result in significant new barriers, according to former U.S. officials and some safety advocates.

"Honestly, the last thing under this administration that car companies and self-driving vehicle developers have to worry about is heavy regulation," said David Friedman, a former National Highway Traffic Safety Administration administrator under President Barack Obama who's now director of cars and product policy for Consumers Union.

Who is to blame when driverless cars have an accident?

[Partial] or full autonomy raises the question of who is to blame in the case of an accident involving a self-driving car. In conventional (human-driven) cars, the answer is simple: the driver is responsible because they are in control. When it comes to autonomous vehicles, it isn't so clear-cut. We propose a blockchain-based framework that uses sensor data to ascertain liability in accidents involving self-driving cars.
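
The proposal is only summarized above, so here is a rough, hypothetical sketch in Python of the general idea rather than the authors' actual framework: hash-chaining the vehicle's sensor records so that any after-the-fact edit to the data an investigator relies on becomes detectable. All class names, fields, and values below are invented for illustration.

# Minimal sketch of a hash-chained ledger of vehicle sensor records.
# Illustrative only; the record fields, class names, and chaining scheme
# are hypothetical, not the framework proposed in the linked article.
import hashlib
import json
import time
from dataclasses import dataclass, field
from typing import List


@dataclass
class SensorBlock:
    vehicle_id: str
    sensor_data: dict        # e.g. speed, brake status, detection flags
    prev_hash: str           # hash of the previous block in the chain
    timestamp: float = field(default_factory=time.time)

    def hash(self) -> str:
        # Deterministic serialization so the same block always hashes the same.
        payload = json.dumps(
            {
                "vehicle_id": self.vehicle_id,
                "sensor_data": self.sensor_data,
                "prev_hash": self.prev_hash,
                "timestamp": self.timestamp,
            },
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()


class SensorLedger:
    """Append-only chain of sensor records; editing an old record breaks the chain."""

    def __init__(self):
        self.blocks: List[SensorBlock] = []

    def append(self, vehicle_id: str, sensor_data: dict) -> SensorBlock:
        prev_hash = self.blocks[-1].hash() if self.blocks else "genesis"
        block = SensorBlock(vehicle_id, sensor_data, prev_hash)
        self.blocks.append(block)
        return block

    def verify(self) -> bool:
        # Each block must still match the hash its successor recorded.
        for prev, cur in zip(self.blocks, self.blocks[1:]):
            if cur.prev_hash != prev.hash():
                return False
        return True


if __name__ == "__main__":
    ledger = SensorLedger()
    ledger.append("AV-001", {"speed_mph": 38, "brake": False, "pedestrian_detected": False})
    ledger.append("AV-001", {"speed_mph": 38, "brake": False, "pedestrian_detected": True})
    print("chain intact:", ledger.verify())  # True

    # Retroactively editing a record is detectable:
    ledger.blocks[0].sensor_data["speed_mph"] = 25
    print("chain intact:", ledger.verify())  # False

A real system would also need vehicle signatures, replication across interested parties (manufacturer, insurer, regulator), and agreed-upon sensor schemas; the sketch only shows the tamper-evidence part.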


Original Submission #1 | Original Submission #2 | Original Submission #3 | Original Submission #4

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Insightful) by Rosco P. Coltrane on Thursday March 22 2018, @01:52PM (16 children)

    by Rosco P. Coltrane (4757) on Thursday March 22 2018, @01:52PM (#656589)

    - A self-driving car that apparently didn't even try to slam on the brakes at the last moment, indicating that it didn't register the pedestrian even when she came into the headlights and was plain to see in the video,

    - A fat sack of shit behind the wheel who wasn't looking ahead and didn't try to slam the brakes either

    Not saying the accident was avoidable - clearly the pedestrian became visible much too late to be avoided. Yet at the very least, the computer or the human backup driver should have tried to do something about it. I think a fully-aware driver paying attention to the road in a manual car would have.

    So I think this is a sign of things to come: fallible human drivers being replaced by even more fallible computer drivers, and humans who are supposed to come to their aid if they cock up, but who really just become complacent and don't give two fucks.

  • (Score: 4, Touché) by FakeBeldin on Thursday March 22 2018, @02:11PM

    by FakeBeldin (3360) on Thursday March 22 2018, @02:11PM (#656595) Journal

    In all fairness, your nick disqualifies you from commenting on this story.
    :)

  • (Score: 2, Insightful) by nitehawk214 on Thursday March 22 2018, @02:53PM (2 children)

    by nitehawk214 (1304) on Thursday March 22 2018, @02:53PM (#656618)

    I think this proves that the human "driver", who doesn't need to be an active part of the trip in 99.99% of cases, will not be paying attention when that 0.01% situation comes up.

    --
    "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
    • (Score: 3, Insightful) by hemocyanin on Thursday March 22 2018, @07:58PM (1 child)

      by hemocyanin (186) on Thursday March 22 2018, @07:58PM (#656832) Journal

      To sit and do nothing but monitor the vehicle and the exterior for hours on end is certainly some kind of horrific torture. I can't imagine anyone being able to do that job effectively.

      • (Score: 1) by nitehawk214 on Thursday March 22 2018, @08:19PM

        by nitehawk214 (1304) on Thursday March 22 2018, @08:19PM (#656836)

        I work near places that test self-driving cars (Argo and Uber), enough that I see them every day. They all have tinted windows, so you can't tell if the driver is paying attention or whether they're in direct control of the vehicle.

        Originally the rule here was that there had to be 2 people in the car: one "driver", and the other an engineer doing testing, etc. But I know I have seen the Argos with only one person in the car. It's possible they are just driving around to collect data and the "driver" is just "driving", but I have doubts.

        Though I bet that it is the computer driving when the car is going the speed limit. They would be the only vehicles on the road going the speed limit on this street. They seem decently safe, I guess.

        I missed out on this event, though: https://techcrunch.com/2018/01/10/argo-ai-self-driving-test-car-hit-in-pittsburgh-as-truck-runs-red-light/ [techcrunch.com]

        All I really want to see is an Uber and an Argo in a traffic accident together.

        --
        "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
  • (Score: 2) by Snotnose on Thursday March 22 2018, @03:13PM (11 children)

    by Snotnose (1623) on Thursday March 22 2018, @03:13PM (#656630)

    - A self-driving car that apparently didn't even try to slam on the brakes at the last moment, indicating that it didn't register the pedestrian even when she came into the headlights and was plain to see in the video,

    Yeah, this is a problem. I'm pretty sure that, like the mysterious plane crashes over the last few decades, they'll figure out what went wrong and fix it so it doesn't happen again. To paraphrase Elon Musk "this one will probably explode too, just in a different way".

    - A fat sack of shit behind the wheel who wasn't looking ahead and didn't try to slam the brakes either

    Yeah, let's see how you do stuck in a seat and told to watch the road at all times. You aren't driving, probably don't even have music, yet you have to maintain constant alertness when nothing is happening and you aren't doing anything.

    Not saying the accident was avoidable - clearly the pedestrian became visible much too late to be avoided.

    This was 100% the pedestrian's fault. We see these bozos all the time. They have the right of way so they just walk in front of moving cars, figuring the car will stop/swerve, and go about their day. This video shows exactly what happens when the car doesn't see you or doesn't react soon enough.

    The message is, pedestrians need to learn to look for farking cars before walking into the road.

    --
    When the dust settled America realized it was saved by a porn star.
    • (Score: 2) by Rosco P. Coltrane on Thursday March 22 2018, @03:24PM (3 children)

      by Rosco P. Coltrane (4757) on Thursday March 22 2018, @03:24PM (#656632)

      Yeah, let's see how you do stuck in a seat and told to watch the road at all times

      She's fucking *paid* to do that because it's her fucking job. If she found it so boring that she couldn't help playing with her cellphone or whatever, she should've changed jobs.

      • (Score: 2) by Snotnose on Thursday March 22 2018, @04:44PM (2 children)

        by Snotnose (1623) on Thursday March 22 2018, @04:44PM (#656677)

        She's fucking *paid* to do that because it's her fucking job. If she found it so boring that she couldn't help playing with her cellphone or whatever, she should've changed jobs.

        How long do folks in the military have to stand guard duty at night when nothing is happening? An hour? Two?

        It's a well-known fact that people get bored when they have nothing to do, and when they get bored their attention wanders. It doesn't matter how well you pay them; only a true ADHD could maintain constant alertness for hours at a time when nothing is happening.

        --
        When the dust settled America realized it was saved by a porn star.
        • (Score: 3, Insightful) by Osamabobama on Thursday March 22 2018, @05:21PM

          by Osamabobama (5842) on Thursday March 22 2018, @05:21PM (#656711)

          That's why they keep logs:

          0100 All secure
          0130 All secure
          0200 All secure
          0300 All secure
          *0310 Late entry: 0230 All secure
          0330 All secure
          0400 Properly relieved by SN Shmuckatelli.
          0400 I, SN Shmuckatelli, have assumed the watch.
          0410 Russian submarine sighted in Lake Baldwin. CDO notified.

          Anyway, if the safety observer had been providing continuous feedback on the car's driving, as one might for a teenage driver, the car might have learned something and the observer would have remained more alert.

          --
          Appended to the end of comments you post. Max: 120 chars.
        • (Score: 0) by Anonymous Coward on Friday March 23 2018, @11:25AM

          by Anonymous Coward on Friday March 23 2018, @11:25AM (#657087)

          A bit off topic, but ADHD people CAN'T maintain focus. The first part of ADHD is "Attention Deficit".

    • (Score: 2) by Knowledge Troll on Thursday March 22 2018, @03:48PM (2 children)

      by Knowledge Troll (5948) on Thursday March 22 2018, @03:48PM (#656643) Homepage Journal

      Yeah, let's see how you do stuck in a seat and told to watch the road at all times. You aren't driving, probably don't even have music, yet you have to maintain constant alertness when nothing is happening and you aren't doing anything.

      Great! The human safety monitors can't possibly do their job, so the safety requirements for testing the robot car can't be met, as has just been proven. Time to stop all testing until that pesky safety thing can be worked out. Maybe we should evaluate all other Level 3 and 4 cars as well, since they assume this kind of supervision is possible.

      This was 100% the pedestrian's fault. We see these bozos all the time. They have the right of way so they just walk in front of moving cars, figuring the car will stop/swerve, and go about their day.

      It is not possible to have the right of way and be at fault. Having the right of way means exactly that: you are not at fault.

      • (Score: 4, Insightful) by linuxrocks123 on Thursday March 22 2018, @10:34PM (1 child)

        by linuxrocks123 (2557) on Thursday March 22 2018, @10:34PM (#656895) Journal

        Nice try, but the point isn't that the human monitor will be perfect, just better than nothing. I don't know what exactly you have against self-driving cars, or why you want to stop them through fearmongering, but it's clear that's what you're trying to do.

        This is a tragic incident, and I hope that both Uber and the Arizona regulators take serious precautions to make sure similar situations are handled better. I also hope people like you don't manage to use this tragedy to halt what promises to be this century's most significant advance in both automotive safety and automotive convenience.

    • (Score: 2) by Sarasani on Thursday March 22 2018, @04:07PM

      by Sarasani (3283) on Thursday March 22 2018, @04:07PM (#656650)

      The message is, pedestrians need to learn to look for farking cars before walking into the road.

      That's a tough lesson to learn for a child chasing a ball across the road. See my other comment [soylentnews.org] for how some countries deal with this conundrum.

      Mind you: obviously I'm not suggesting that pedestrians should not be careful.

    • (Score: 2) by Arik on Friday March 23 2018, @01:24AM (2 children)

      by Arik (4543) on Friday March 23 2018, @01:24AM (#656963) Journal
      "To paraphrase Elon Musk "this one will probably explode too, just in a different way"."

      But that's the attitude and the methodology follows that attitude. So yeah, they'll find this bug, in a very specific, very localized way, and fix it, surely.

      But will they step back and allow themselves to realize that their whole design and development methodology are wildly inappropriate? That's much less likely to happen.

      And if it doesn't, then there will be plenty more bugs to follow.
      --
      If laughter is the best medicine, who are the best doctors?
      • (Score: 2) by Snotnose on Friday March 23 2018, @04:48AM (1 child)

        by Snotnose (1623) on Friday March 23 2018, @04:48AM (#657024)

        But will they step back and allow themselves to realize that their whole design and development methodology are wildly inappropriate? That's much less likely to happen.

        Doesn't matter. Fact is, 99% of human drivers would have killed that woman. Yeah, you can poke holes in the tech. Yeah, you can make the tech better. But as long as there are stupid people who step in front of a car at night, pedestrians are gonna die.

        --
        When the dust settled America realized it was saved by a porn star.
        • (Score: 2) by Arik on Friday March 23 2018, @12:16PM

          by Arik (4543) on Friday March 23 2018, @12:16PM (#657094) Journal
          "Fact is, 99% of human drivers would have killed that woman."

          That's not a fact, that's a bare assertion, one that flies directly in the face of the evidence available.

          That same scenario happens hundreds if not thousands of times a day, yet it's rare for someone to be killed like this. The car appears to have had time enough to swerve around her or to apply the brakes and stop, yet *neither was attempted.*

          What's your interest? Uber stockholder?
          --
          If laughter is the best medicine, who are the best doctors?