
posted by takyon on Thursday March 22 2018, @12:34PM   Printer-friendly
from the carmagedon dept.

A few Soylentils wrote in to tell us about a fatal accident between a pedestrian and an autonomous Uber vehicle.

Update - Video Released of Fatal Uber - Pedestrian Accident

I debated just replying to the original story, but this seemed a pretty significant update to me:

The Uber vehicle was operating in autonomous mode when it crashed into 49-year-old Elaine Herzberg on Sunday evening. Herzberg was transported to a hospital, where she later died from her injuries, in what may be the first known pedestrian fatality in a self-driving crash.

The video footage does not conclusively show who is at fault. Tempe police initially reported that Herzberg appeared suddenly; however, the video footage seems to show her coming into view a number of seconds before the crash. It also showed the vehicle operator behind the wheel intermittently looking down while the car was driving itself.

The link shows video of the seconds just before the accident.

The pedestrian did not step out in front of the vehicle; she was essentially already out in the middle of the road, and her lateral movement was nearly irrelevant. She might as well have been a stationary object in the middle of the road. You can see the headlights bring her feet into view first (meaning she was pretty much in the car's lane before the headlights reached her), then move up her body; she's already in the middle of the road in front of the car when she comes into view.

If I were driving that car, I think I'd have had time to hit the brakes (but not stop in time). I also think that if the camera view is an accurate representation of what was really visible, then the car was overdriving its headlights. Although, given my experience with cameras, I wouldn't be surprised if actual visibility was better than what the video shows.

This, in my opinion, is pretty damning.

Police Chief: Uber Self-Driving Car "Likely" Not At Fault In Fatal Crash

The chief of the Tempe Police has told the San Francisco Chronicle that Uber is likely not responsible for the Sunday evening crash that killed 49-year-old pedestrian Elaine Herzberg. "I suspect preliminarily it appears that the Uber would likely not be at fault in this accident," said chief Sylvia Moir.

Herzberg was "pushing a bicycle laden with plastic shopping bags," according to the Chronicle's Carolyn Said, when she "abruptly walked from a center median into a lane of traffic."

After viewing video captured by the Uber vehicle, Moir concluded that "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." Moir added that "it is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available."

Self-Driving Car Testing Likely to Continue Unobstructed

Self-Driving Cars Keep Rolling Despite Uber Crash

The death of a woman who was struck by a self-driving Uber in Arizona on Sunday has auto-safety advocates demanding that U.S. regulators and lawmakers slow down the rush to bring autonomous vehicles to the nation's roadways. Don't count on it.

Efforts to streamline regulations to accommodate the emerging technology have been under way since the Obama administration with strong bipartisan support. And the Trump administration's aversion to restrictions and regulations makes it even more unlikely that the accident in Tempe, Arizona, in which an autonomous Uber sport utility vehicle struck and killed a pedestrian, will result in significant new barriers, according to former U.S. officials and some safety advocates.

"Honestly, the last thing under this administration that car companies and self-driving vehicle developers have to worry about is heavy regulation," said David Friedman, a former National Highway Traffic Safety Administration administrator under President Barack Obama who's now director of cars and product policy for Consumers Union.

Who is to blame when driverless cars have an accident?

[Partial] or full autonomy raises the question of who is to blame in the case of an accident involving a self-driving car. In conventional (human-driven) cars, the answer is simple: the driver is responsible because they are in control. When it comes to autonomous vehicles, it isn't so clear cut. We propose a blockchain-based framework that uses sensor data to ascertain liability in accidents involving self-driving cars.
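The submission doesn't spell out how such a framework would work, but the core idea is presumably a tamper-evident, append-only log of sensor readings and control decisions that manufacturers, insurers, and investigators can audit after a crash. Below is a minimal sketch of that idea in Python; the class name, record fields, and hashing scheme are illustrative assumptions rather than details taken from the proposal:

    import hashlib
    import json
    import time

    def record_hash(record: dict, prev_hash: str) -> str:
        """Hash a sensor record together with the previous entry's hash to chain entries."""
        payload = json.dumps(record, sort_keys=True) + prev_hash
        return hashlib.sha256(payload.encode()).hexdigest()

    class SensorLedger:
        """Append-only, hash-chained log of vehicle sensor readings (illustrative only)."""

        def __init__(self):
            self.entries = []          # list of (record, hash) pairs
            self.prev_hash = "0" * 64  # genesis value

        def append(self, sensor: str, reading: dict):
            record = {"ts": time.time(), "sensor": sensor, "reading": reading}
            h = record_hash(record, self.prev_hash)
            self.entries.append((record, h))
            self.prev_hash = h

        def verify(self) -> bool:
            """Recompute the chain; tampering with any entry breaks every later hash."""
            prev = "0" * 64
            for record, h in self.entries:
                if record_hash(record, prev) != h:
                    return False
                prev = h
            return True

    # Example: log a detection and a control decision, then audit the chain.
    ledger = SensorLedger()
    ledger.append("lidar", {"object": "pedestrian", "distance_m": 22.0})
    ledger.append("control", {"brake": 0.0, "speed_mph": 38})
    assert ledger.verify()

In a full system, the chain (or at least its hashes) would presumably be replicated across the interested parties (manufacturer, regulator, insurer) so that no single party could quietly rewrite the record after an accident.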


Original Submission #1 | Original Submission #2 | Original Submission #3 | Original Submission #4

 
  • (Score: 5, Insightful) by ledow on Thursday March 22 2018, @02:50PM (7 children)

    by ledow (5567) on Thursday March 22 2018, @02:50PM (#656613) Homepage

    The car does not dip forward at any point: either there was no significant braking, or it literally has no suspension whatsoever (equally dangerous).

    The car also (alternatively) does not swerve or try to manoeuvre at any point - no significant hazard detection or avoidance.

    The car takes NO corrective action whatsoever, looking at the video. Nothing at all. And the fact that the footage cuts off IMMEDIATELY when it strikes her but does not continue makes me suspicious that the driver had to force the car to stop, or that its actual braking occurred long after the accident had already happened.

    Additionally - if it's that dark, it shouldn't be going that fast. If the headlights are that bad (and the street lights are either similarly bad or the footage has been tweaked), it shouldn't be going that fast. If it's not actually that dark, it should've seen her a lot sooner (and braked). If the cameras are that bad, the LIDAR should be so much better in their place. But there's no proof it actually tried to do anything at all.

    She's NOT darting out into traffic, she's just wheeling a bike across a road having already crossed one lane and that car comes out of nowhere and strikes her.

    Additionally the human driver is paying NO attention whatsoever despite being legally required to, whether it's an automated car or not.

    Fault: Car and driver.

    Even the local laws on jaywalking could only add additional charges against the pedestrian, not free the car and driver of blame. It's not like a toddler in dark clothing ran out from behind a series of parked cars to chase a ball that's invisible at night (which I've personally had happen... lucky I was watching your friends on the side of the road, little fella, and picked up that something was happening or about to happen). It's an adult female walking a bike across an empty road who makes it to the second lane when there are NO OTHER CARS in shot whatsoever, under a street-lit section, and doesn't even have time to react to a car driving way in excess of its safe visible distance that takes no corrective action whatsoever.

  • (Score: 2) by kazzie on Thursday March 22 2018, @03:08PM (5 children)

    by kazzie (5309) Subscriber Badge on Thursday March 22 2018, @03:08PM (#656626)

    She's NOT darting out into traffic, she's just wheeling a bike across a road having already crossed one lane and that car comes out of nowhere and strikes her.

    She's also crossing near a set of street lights (but the video doesn't show the area illuminated by them, only what's in the headlights).

    • (Score: 2, Touché) by nitehawk214 on Thursday March 22 2018, @05:04PM

      by nitehawk214 (1304) on Thursday March 22 2018, @05:04PM (#656695)

      And even if the woman was 100% at fault, that does not give a driver (computer or human) permission to run her down. I am willing to bet everyone who has ever driven a vehicle in an area with a population greater than 1 has had someone illegally cross or pull out in front of them.

      If the machine had reacted at all, it might have hit her non-fatally.

      --
      "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
    • (Score: 2) by slinches on Thursday March 22 2018, @09:49PM (3 children)

      by slinches (5049) on Thursday March 22 2018, @09:49PM (#656884)

      Being near to, but not in, the illuminated area of a street light is actually worse than having no street light at all, due to the additional contrast and glare. At least that's the case for visible-light cameras and human eyes. The radar and lidar systems, though, should have detected the pedestrian; those should not be affected by light or darkness.

      I'm questioning whether all of the sensing systems were functional. It's possible that they could have had a failed LIDAR unit or even intentionally disabled some of the sensors in order to evaluate how effective the detection algorithms are with hardware in a degraded state.

      • (Score: 2) by ledow on Thursday March 22 2018, @11:55PM (1 child)

        by ledow (5567) on Thursday March 22 2018, @11:55PM (#656927) Homepage

        And thus the car should have slowed.

        A car that DOESN'T KNOW it can't see is worse than one that can't see at all.

        And, sorry, but any modern camera can adjust to poor lighting in fractions of a second. It may introduce a little noise, but it's a lot quicker than a human eye adjusting.

        If the units fail without any way to detect that, you're going to kill someone anyway. If the car was being driven automatically with a sensor that "knows" it's faulty or not able to operate properly, you're going to kill someone anyway. If the driver is able to ignore or override the warning and STILL not have his hands on the wheel or eyes on the road, you're going to kill someone anyway.

        The problem with automated cars at the moment is systematic. Everything from the Tesla that let a guy fall asleep in traffic without touching the wheel at all, to this one that doesn't even TRY to brake and is going too fast for the lighting/conditions. It's too much trust, instantly, and then saying "Oh, but this bit obviously wasn't working perfectly. We'll fix it in the next version." Explain your product fault to the dead woman's family.

        The process has been given no thought and has leapt straight to "let's put these things on the road with real kids", skipping an awful lot of quite obvious lab testing that should pick up problems like the ones you mention.

        I'm questioning whether the human "driver" (in the loosest sense of the word) will now face a death-by-dangerous-driving charge. If not, why not? And if not, who does get that charge applied to them? Can that car software "lose its licence"? No matter what the manufacturer may have done "voluntarily" in terms of pulling them off the road, who's actually there to say "Nope, that car/software does NOT go back behind the wheel until we ascertain the cause"?

        Rather than "would a human have reacted", let's work out who's going to pay the dead woman's funeral bill and face the consequences of shoddy driving. With any luck, it should be both the driver AND the car, just to shit people up about ever getting behind the wheel of an automated car like that.

        • (Score: 2) by slinches on Friday March 23 2018, @06:28AM

          by slinches (5049) on Friday March 23 2018, @06:28AM (#657037)

          I wasn't trying to suggest that a failure excuses Uber of any liability. Precisely the opposite, in fact. If they didn't put adequate safeguards in place for predictable failure modes, or were intentionally testing in a way that puts the public at undue risk, then that could make them liable for criminal negligence.

          The question is whether this is an inherent design flaw common to all or a poor implementation by Uber. If it's the former, we need to put a hold on testing these things on public roads until the issue can be resolved. If it's the latter, then there needs to be some serious repercussions for Uber and potentially some additional regulatory oversight for all of the companies.

          Although I don't think this necessarily implies that it was a poor decision to go to live field testing in general. With something like this, which has the potential to save thousands of lives per year, some level of risk has to be acceptable in order to accelerate development. There's only so much that can be done with controlled lab tests. At some point you have to prove that the systems can handle the random situations only the real environment can provide. The risk would be somewhat lower if they had completed another decade of lab tests, but in that time maybe 10,000 people would die in accidents that could have been prevented had the tech been available earlier. Is that really a preferable outcome?

      • (Score: 2) by kazzie on Friday March 23 2018, @06:40AM

        by kazzie (5309) Subscriber Badge on Friday March 23 2018, @06:40AM (#657040)

        Being near to, but not in, the illuminated area of a street light is actually worse than having no street light at all, due to the additional contrast and glare. At least that's the case for visible-light cameras and human eyes.

        I'll grant you that, but I think it's likely that she was in the illuminated area of the street lights. The contrast settings of the camera (apparently calibrated for the brightness of the headlamps) don't show anything at all being illuminated by the street lights.

  • (Score: 2) by wonkey_monkey on Friday March 23 2018, @05:41PM

    by wonkey_monkey (279) on Friday March 23 2018, @05:41PM (#657187) Homepage

    It's an adult female walking a bike across an empty road

    Quite obviously not an empty road.

    and that car comes out of nowhere

    I'm not sure how you think cars work...

    Additionally the human driver is paying NO attention whatsoever despite being legally required to, whether it's an automated car or not.

    Regardless of where blame ultimately lies, there were at least two people not paying as much attention as they could have done in this situation.

    --
    systemd is Roko's Basilisk