posted by takyon on Thursday March 22 2018, @12:34PM
from the carmagedon dept.

A few Soylentils wrote in to tell us about a fatal accident between a pedestrian and an autonomous Uber vehicle.

Update: Video Released of Fatal Uber-Pedestrian Accident

I debated just replying to the original story, but this seemed a pretty significant update to me:

The Uber vehicle was operating in autonomous mode when it crashed into 49-year-old Elaine Herzberg on Sunday evening. Herzberg was transported to a hospital, where she later died from her injuries, in what may be the first known pedestrian fatality in a self-driving crash.

The video footage does not conclusively show who is at fault. Tempe police initially reported that Herzberg appeared suddenly; however, the video footage seems to show her coming into view a number of seconds before the crash. It also showed the vehicle operator behind the wheel intermittently looking down while the car was driving itself.

The link shows video of the seconds just before the accident.

The pedestrian did not step out in front of the vehicle; she was essentially already out in the middle of the road, and her lateral movement was nearly irrelevant. She might as well have been a stationary object in the middle of the road. You can see the headlights bring her feet into view first (meaning she was already in the lane before the headlights could reach her) and then move up her body; she is already in the middle of the road, in front of the car, when she comes into view.

If I were driving that car, I think I'd have had time to hit the brakes (though not to stop in time). I also think that if the camera view is an accurate representation of what was really visible, then the car was overdriving its headlights. Although, given my experience with cameras, I wouldn't be surprised if actual visibility was better than what the video shows.

This, in my opinion, is pretty damning.
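
To put rough numbers on the "overdriving its headlights" point, a simple stopping-distance check is enough. Everything below (the roughly 38 mph speed, the reaction time, the braking deceleration, and the ~25 m of usable low-beam range the video seems to suggest) is an illustrative assumption, not a figure from the investigation:

def stopping_distance_m(speed_mps, reaction_s=1.5, decel_mps2=7.0):
    # perception-reaction distance plus braking distance (dry pavement assumed)
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

speed_mps = 38 * 0.44704       # assumed speed of roughly 38 mph, converted to m/s (~17 m/s)
visible_range_m = 25.0         # assumed usable low-beam range suggested by the video

needed_m = stopping_distance_m(speed_mps)
print(f"stopping distance ~{needed_m:.0f} m vs. visible range ~{visible_range_m:.0f} m")
if needed_m > visible_range_m:
    print("on these assumptions, the car is overdriving its headlights")

On those assumed numbers the stopping distance comes out well beyond the visible range, which is exactly what "overdriving the headlights" means.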

Police Chief: Uber Self-Driving Car "Likely" Not At Fault In Fatal Crash

The chief of the Tempe Police has told the San Francisco Chronicle that Uber is likely not responsible for the Sunday evening crash that killed 49-year-old pedestrian Elaine Herzberg. "I suspect preliminarily it appears that the Uber would likely not be at fault in this accident," said Chief Sylvia Moir.

Herzberg was "pushing a bicycle laden with plastic shopping bags," according to the Chronicle's Carolyn Said, when she "abruptly walked from a center median into a lane of traffic."

After viewing video captured by the Uber vehicle, Moir concluded that "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway." Moir added that "it is dangerous to cross roadways in the evening hour when well-illuminated, managed crosswalks are available."

Self-Driving Car Testing Likely to Continue Unobstructed

Self-Driving Cars Keep Rolling Despite Uber Crash

The death of a woman who was struck by a self-driving Uber in Arizona on Sunday has auto-safety advocates demanding that U.S. regulators and lawmakers slow down the rush to bring autonomous vehicles to the nation's roadways. Don't count on it.

Efforts to streamline regulations to accommodate the emerging technology have been under way since the Obama administration with strong bipartisan support. And the Trump administration's aversion to restrictions and regulations makes it even more unlikely that the accident in Tempe, Arizona, in which an autonomous Uber sport utility vehicle struck and killed a pedestrian, will result in significant new barriers, according to former U.S. officials and some safety advocates.

"Honestly, the last thing under this administration that car companies and self-driving vehicle developers have to worry about is heavy regulation," said David Friedman, a former National Highway Traffic Safety Administration administrator under President Barack Obama who's now director of cars and product policy for Consumers Union.

Who is to blame when driverless cars have an accident?

[Partial] or full autonomy raises the question of who is to blame in the case of an accident involving a self-driving car. In conventional (human-driven) cars, the answer is simple: the driver is responsible because they are in control. When it comes to autonomous vehicles, it isn't so clear cut. We propose a blockchain-based framework that uses sensor data to ascertain liability in accidents involving self-driving cars.
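
The excerpt doesn't say how the proposed framework actually works, but the core idea of a tamper-evident, append-only sensor log can be sketched in a few lines. The class and field names below are illustrative only and are not taken from the paper:

import hashlib
import json
import time

def hash_block(block):
    # deterministic SHA-256 over the block's canonical JSON form
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class SensorLedger:
    """Append-only chain of sensor records; each block commits to the previous one."""

    def __init__(self):
        self.chain = [{"index": 0, "prev": "0" * 64, "data": "genesis", "ts": time.time()}]

    def append(self, sensor_data):
        block = {
            "index": len(self.chain),
            "prev": hash_block(self.chain[-1]),
            "data": sensor_data,
            "ts": time.time(),
        }
        self.chain.append(block)
        return block

    def verify(self):
        # any after-the-fact edit to an earlier record breaks the hash links
        return all(self.chain[i]["prev"] == hash_block(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

ledger = SensorLedger()
ledger.append({"speed_mps": 17.0, "lidar_min_range_m": 42.1, "brake": False})
ledger.append({"speed_mps": 16.8, "lidar_min_range_m": 24.3, "brake": False})
print("log intact:", ledger.verify())   # False if any earlier record is altered

The point is simply that once each record commits to the hash of the previous one, no single party (manufacturer, operator, or insurer) can quietly rewrite the sensor history after a crash.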


Original Submission #1 | Original Submission #2 | Original Submission #3 | Original Submission #4

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Insightful) by Immerman on Thursday March 22 2018, @01:56PM (10 children)

    by Immerman (3985) on Thursday March 22 2018, @01:56PM (#656590)

    That's the thing though, isn't it? If that camera view was the car's view, then the car was grossly over-driving its headlights. It's the responsibility of a driver to drive within the limits of available visibility for exactly this reason: hitting something stationary in the road is ALWAYS your fault.

  • (Score: 1, Insightful) by Anonymous Coward on Thursday March 22 2018, @02:23PM (3 children)

    by Anonymous Coward on Thursday March 22 2018, @02:23PM (#656601)

    There's some question about how accurate the footage is in that respect; however, it does appear that the headlights weren't properly aimed, for whatever reason. If you watch the video, only a handful of lane stripes are actually visible, which indicates that the headlights aren't illuminating as far ahead as they should. Even if it hadn't been a person, a car with headlights like that is liable to run into all manner of items in the road, because they won't be visible until it's too late.

    Really, this shouldn't be unexpected: Uber has a well-deserved reputation for not thinking about the consequences of its actions. It wasn't that long ago that its drivers didn't have insurance policies covering commercial driving, and the company still doesn't have real permission to operate as a taxi service in most areas.

    • (Score: 0) by Anonymous Coward on Thursday March 22 2018, @05:32PM (2 children)

      by Anonymous Coward on Thursday March 22 2018, @05:32PM (#656717)

      Regular headlights are aimed low. This is required by law. You are not allowed to blind the oncoming drivers.

      Vehicles are also required to have bright headlights. These are required to be aimed higher, for greater reach. These are to be used when there isn't oncoming traffic.

      Does the self-driving system operate the headlights at all? (maybe the user is left to flip them on) If it operates them, does it ever turn on the brights?

      • (Score: 4, Insightful) by frojack on Thursday March 22 2018, @08:59PM (1 child)

        by frojack (1554) on Thursday March 22 2018, @08:59PM (#656862) Journal

        What does the law say about infrared headlights that can only be seen by cameras, wise guy?

        If the drive system is ONLY using the camera supplying this video, then that is engineering malfeasance.

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 3, Insightful) by JoeMerchant on Thursday March 22 2018, @09:35PM

          by JoeMerchant (3937) on Thursday March 22 2018, @09:35PM (#656876)

          As soon as there are two IR sensing cars on the road, those IR high beams will need to be dimmed for oncoming traffic just like visible light.

          --
          🌻🌻 [google.com]
  • (Score: 1, Interesting) by Anonymous Coward on Thursday March 22 2018, @02:55PM (3 children)

    by Anonymous Coward on Thursday March 22 2018, @02:55PM (#656619)

    That video has been tampered with to make it darker. I also doubt that this video is what the car uses when processing sensor information.

    Release the LIDAR. There is NO reason LIDAR should not have picked this up.
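
    To illustrate the point about LIDAR: it ranges objects with its own laser pulses, so ambient darkness is irrelevant, and detecting something in the car's path is pure geometry on the returned points. The corridor width, range limit, and point threshold below are illustrative assumptions, not anything from Uber's actual stack:

    from typing import List, Tuple

    Point = Tuple[float, float, float]   # (x forward, y left, z up) in metres, vehicle frame

    def obstacle_in_path(points: List[Point],
                         corridor_half_width_m: float = 1.5,
                         max_range_m: float = 60.0,
                         min_height_m: float = 0.3,
                         min_points: int = 5) -> bool:
        # flag an obstacle if enough above-ground returns fall inside the driving corridor
        hits = [p for p in points
                if 0.0 < p[0] < max_range_m            # ahead of the vehicle
                and abs(p[1]) < corridor_half_width_m  # inside the lane corridor
                and p[2] > min_height_m]               # above the road surface
        return len(hits) >= min_points

    # a cluster of returns about 30 m ahead, roughly pedestrian-with-bicycle sized
    cloud = [(30.0 + 0.1 * i, 0.2, 0.4 + 0.2 * i) for i in range(8)]
    print(obstacle_in_path(cloud))   # True; darkness has no bearing on this check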

    • (Score: 5, Informative) by kazzie on Thursday March 22 2018, @03:03PM (2 children)

      by kazzie (5309) on Thursday March 22 2018, @03:03PM (#656621)

      Supporting evidence for this assertion:

      The pedestrian is crossing 5~10 metres away from a pair of street lights (a sensible place to cross and be seen) but is not visible in the video until she's covered by the (low) beam of the headlights.

      • (Score: 0) by Anonymous Coward on Thursday March 22 2018, @05:04PM

        by Anonymous Coward on Thursday March 22 2018, @05:04PM (#656694)

        LIDAR has nothing to do with 'visible'. The vehicle is clearly lacking if it can't see in the 'dark'. Airliners have TCAS; there is no reason not to equip other vehicles with the same type of technology. Every cell phone can have a transmitter.

        Ahhh, but the money shot was the driver's face. Worth a million bucks that was!

      • (Score: 1, Interesting) by Anonymous Coward on Thursday March 22 2018, @06:25PM

        by Anonymous Coward on Thursday March 22 2018, @06:25PM (#656760)

        How this got promoted up is in question.

        But I'm sorry, being visible to the camera has nothing to do with being visible to LIDAR.

        The car either saw her and disregarded her, or did not see her. Either way, the car's hardware or software is at fault. There is NO FREAKING WAY that that video is representative of what the car actually saw.

  • (Score: 0) by Anonymous Coward on Thursday March 22 2018, @10:45PM (1 child)

    by Anonymous Coward on Thursday March 22 2018, @10:45PM (#656898)

    The car didn't hit something that was stationary.

    The pedestrian WALKED INTO THE PATH OF THE CAR.

    I am guessing you don't drive much, if at all, because failing to note that the pedestrian was MOVING is a basic mistake that a person with much driving experience is unlikely to make.

    • (Score: 0) by Anonymous Coward on Friday March 23 2018, @08:41AM

      by Anonymous Coward on Friday March 23 2018, @08:41AM (#657062)

      The pedestrian might as well have been stationary: the headlights hit her shoes first, not the front wheel of her bicycle, as they would have if she had walked into the light beam.

      Had it been a block of concrete instead, the only difference would be which side of the car was damaged.