posted by takyon on Thursday March 22 2018, @12:34PM
from the carmagedon dept.

A few Soylentils wrote in to tell us about a fatal accident between a pedestrian and an autonomous Uber vehicle.

Update - Video Released of Fatal Uber - Pedestrian Accident

I debated just replying to the original story, but this seemed a pretty significant update to me:

The Uber vehicle was operating in autonomous mode when it crashed into 49-year-old Elaine Herzberg on Sunday evening. Herzberg was transported to a hospital, where she later died from her injuries, in what may be the first known pedestrian fatality in a self-driving crash.

The video footage does not conclusively show who is at fault. Tempe police initially reported that Herzberg appeared suddenly; however, the video footage seems to show her coming into view a number of seconds before the crash. It also showed the vehicle operator behind the wheel intermittently looking down while the car was driving itself.

The link shows video of the seconds just before the accident.

The pedestrian did not step out in front of the vehicle; she was essentially out in the middle of the road, and her lateral movement was nearly irrelevant. She might as well have been a stationary object in the middle of the road. You can see the headlights bring her feet into view first (meaning she was already in the car's path before the headlights reached her), and then move up her body; she is already in the middle of the road in front of the car when she comes into view.

If I were driving that car, I think I'd have had time to hit the brakes (but not to stop in time). I also think that if the camera view is an accurate representation of what was really visible, then the car was overdriving its headlights. Although, given my experience with cameras, I wouldn't be surprised if actual visibility was better than what the video shows.
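"Overdriving your headlights" means your total stopping distance (reaction distance plus braking distance) is longer than the distance your headlights illuminate. A rough back-of-the-envelope check, using only illustrative assumptions (the speed, reaction time, friction coefficient, and low-beam range below are typical textbook values, not figures from this case):

```python
# Rough check of "overdriving the headlights": is total stopping distance
# longer than the illuminated range? All numbers are illustrative assumptions.

def stopping_distance_m(speed_ms, reaction_s=1.5, mu=0.7, g=9.81):
    """Reaction distance plus braking distance on dry asphalt (mu ~ 0.7)."""
    reaction_dist = speed_ms * reaction_s
    braking_dist = speed_ms ** 2 / (2 * mu * g)
    return reaction_dist + braking_dist

speed = 17.0           # ~38 mph in m/s (assumed urban speed)
low_beam_range = 40.0  # typical low-beam illumination in metres (assumed)

d = stopping_distance_m(speed)
print(f"stopping distance: {d:.1f} m")
print("overdriving headlights" if d > low_beam_range else "within headlight range")
```

With these assumptions the stopping distance comes out around 46 m, beyond the assumed low-beam range, which is the scenario the comment describes.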

This, in my opinion, is pretty damning.

Police Chief: Uber Self-Driving Car "Likely" Not At Fault In Fatal Crash

The chief of the Tempe Police has told the San Francisco Chronicle that Uber is likely not responsible for the Sunday evening crash that killed 49-year-old pedestrian Elaine Herzberg. "I suspect preliminarily it appears that the Uber would likely not be at fault in this accident," said chief Sylvia Moir.

Herzberg was "pushing a bicycle laden with plastic shopping bags," according to the Chronicle's Carolyn Said, when she "abruptly walked from a center median into a lane of traffic."

After viewing video captured by the Uber vehicle, Moir concluded that “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." Moir added that "it is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available."

Self-Driving Car Testing Likely to Continue Unobstructed

Self-Driving Cars Keep Rolling Despite Uber Crash

The death of a woman who was struck by a self-driving Uber in Arizona on Sunday has auto-safety advocates demanding that U.S. regulators and lawmakers slow down the rush to bring autonomous vehicles to the nation's roadways. Don't count on it.

Efforts to streamline regulations to accommodate the emerging technology have been under way since the Obama administration with strong bipartisan support. And the Trump administration's aversion to restrictions and regulations makes it even more unlikely that the accident in Tempe, Arizona, in which an autonomous Uber sport utility vehicle struck and killed a pedestrian, will result in significant new barriers, according to former U.S. officials and some safety advocates.

"Honestly, the last thing under this administration that car companies and self-driving vehicle developers have to worry about is heavy regulation," said David Friedman, a former National Highway Traffic Safety Administration administrator under President Barack Obama who's now director of cars and product policy for Consumers Union.

Who is to blame when driverless cars have an accident?

[Partial] or full autonomy raises the question of who is to blame in the case of an accident involving a self-driving car. In conventional (human-driven) cars, the answer is simple: the driver is responsible because they are in control. When it comes to autonomous vehicles, it isn't so clear cut. We propose a blockchain-based framework that uses sensor data to ascertain liability in accidents involving self-driving cars.


Original Submission #1 · Original Submission #2 · Original Submission #3 · Original Submission #4

  • (Score: 2) by theluggage (1797) on Friday March 23 2018, @05:00PM (#657169)

    In many countries it is not illegal to cross the road when not at a crossing. Whether that is a smart thing to do is another question.

    That's not the point.

    In virtually all countries, running people over with cars is frowned upon, even if the pedestrian was partly at fault. Even if the driver is not legally liable, it is not a Good Thing to have happen and taking steps to avoid it is highly recommended.

    As others have commented: if that video was an honest representation of the visibility from the car, then the car was driving too fast towards a dark void it couldn't see into. If it wasn't, there was nothing blocking the line of sight between the car and the pedestrian, so the car should have detected her in time to stop or swerve. In fact, it looks as if the car totally ignored the pedestrian, even after she appeared on the dashcam. It doesn't really matter whether the "obstacle" was a careless pedestrian, an animal, a fallen tree, or a pile of bricks fallen off the back of a truck: the car should have reacted. If the pedestrian was crossing in a dark patch without looking, then she is at fault as well, not instead.

    It's also the sort of thing that self-driving vehicles should be good at. "What to do if an obstacle is detected in the next 3 seconds - is there a car behind, is the opposite lane ahead clear, what's the stopping distance?" should be a sub-process ticking away in the background, just like it does for the mythical textbook-perfect human driver.
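    The sub-process the commenter describes can be sketched as a simple decision rule: can the car stop before the obstacle, and if not, is there room to go around? This is a toy illustration only; the reaction time, deceleration, and action names are assumptions, and nothing here reflects Uber's actual software.

```python
# Toy version of the background obstacle check described above.
# reaction_s and decel are illustrative assumptions, not real vehicle parameters.

def react_to_obstacle(distance_m, speed_ms, adjacent_lane_clear,
                      reaction_s=0.5, decel=6.0):
    """Pick an evasive action for an obstacle detected ahead in the lane."""
    # Distance covered while the system reacts, plus braking distance.
    braking_dist = speed_ms * reaction_s + speed_ms ** 2 / (2 * decel)
    if braking_dist <= distance_m:
        return "brake"            # comfortable stop before the obstacle
    if adjacent_lane_clear:
        return "swerve"           # cannot stop in time, but can go around
    return "emergency_brake"      # scrub off as much speed as possible

print(react_to_obstacle(50.0, 17.0, adjacent_lane_clear=False))  # brake
print(react_to_obstacle(20.0, 17.0, adjacent_lane_clear=True))   # swerve
```

    A real planner would run this continuously against every tracked object, which is the "ticking away in the background" behaviour the comment calls for.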
