
SoylentNews is people

posted by janrinok on Wednesday November 06 2019, @11:59PM   Printer-friendly
from the just-switch-off-and-start-it-again dept.

Submitted via IRC for SoyCow1337

In review of fatal Arizona crash, U.S. agency says Uber software had flaws

WASHINGTON (Reuters) - An Uber self-driving test vehicle that struck and killed an Arizona woman in 2018 had software flaws, the National Transportation Safety Board said Tuesday as it disclosed the company’s autonomous test vehicles were involved in 37 crashes over the prior 18 months.

The NTSB may use the findings from the first fatal self-driving car accident to make recommendations to the industry on how it addresses self-driving software issues, or to regulators on how to oversee the industry.

The board will meet Nov. 19 to determine the probable cause of the March 2018 accident in Tempe, Arizona, that killed 49-year-old Elaine Herzberg as she was walking a bicycle across a street at night.

In a report released ahead of the meeting, the NTSB said the Uber Technologies Inc vehicle had failed to properly identify her as a pedestrian crossing a street.

That accident prompted significant safety concerns about the nascent self-driving car industry, which is working to get vehicles into commercial use.

In the aftermath of the crash, Uber suspended all testing and did not resume until December in Pennsylvania, with revised software and significant new restrictions and safeguards.

A spokeswoman for Uber's self-driving car effort, Sarah Abboud, said the company regretted the crash that killed Herzberg and noted it has “adopted critical program improvements to further prioritize safety. We deeply value the thoroughness of the NTSB's investigation into the crash and look forward to reviewing their recommendations.”

The NTSB reported at least two prior crashes in which Uber test vehicles may not have identified roadway hazards. The NTSB said between September 2016 and March 2018, there were 37 crashes of Uber vehicles in autonomous mode, including 33 that involved another vehicle striking test vehicles.


Original Submission

  • (Score: 5, Insightful) by PiMuNu on Thursday November 07 2019, @10:48AM (3 children)

    by PiMuNu (3823) on Thursday November 07 2019, @10:48AM (#917271)

    FTFA:
    > The system design did not include a consideration for jaywalking pedestrians

    One of the goals a *programmer/data analyst* should have optimised for is objects straying onto the road while not at a crossing, be they people, animals, or anything else. This was *not* included in the algorithm. Allowing such an algorithm onto the road was an accident waiting to happen, and I hope Uber gets sued into oblivion for it.

    See also El Reg:
    https://www.theregister.co.uk/2019/11/06/uber_self_driving_car_death/ [theregister.co.uk]

    Nb: Use of AI makes me angry, this is *NOT* intelligent in any normal sense of the word. It is an optimisation routine that optimises for certain programmer-led goals.

  • (Score: 1) by khallow on Friday November 08 2019, @12:57PM (2 children)

    by khallow (3766) Subscriber Badge on Friday November 08 2019, @12:57PM (#917845) Journal

    This was *not* included in the algorithm.

    That's a ridiculous assertion to make. First, an actual optimization would be to not drive at all. Doing anything with forward motion at all disrupts the optimization. Second, these vehicles would have run into a lot more stuff, if they weren't trying very hard not to.

    • (Score: 3, Interesting) by PiMuNu on Friday November 08 2019, @03:33PM (1 child)

      by PiMuNu (3823) on Friday November 08 2019, @03:33PM (#917888)

      Quotes from theregister:

      https://www.theregister.co.uk/2019/11/06/uber_self_driving_car_death/ [theregister.co.uk]

      the code couldn't recognize her as a pedestrian, because she was not at an obvious designated crossing.

      “The system design did not include a consideration for jaywalking pedestrians,” the watchdog stated

      > First, an actual optimization would be to not drive at all.

      I think you misunderstand my use of "optimization". The algorithms in question are what I would consider minimisation or optimisation routines. From a set of input variables, they seek to find an optimised or minimised set of parameters to achieve some programmer-led goals. For example, based on a set of pixels, they seek to find the "best fit" or "optimal" identification for what the object was e.g. "bicycle", "car" or "fish", and what its path was, e.g. "stationary", "about to hit the car", etc.

      If the programmer tells the algorithm to treat objects as potentially crossing only when they are near a pedestrian crossing, then that would seem negligent, but IANAL. This is what the Register article is implying.
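      To make the distinction concrete, here is a toy sketch of the gating logic the article implies versus the behaviour argued for above. All of the function names, scores, and flags are hypothetical illustrations, not Uber's actual software:

```python
def classify(scores):
    """Pick the label with the highest confidence -- the 'best fit'."""
    return max(scores, key=scores.get)

def flawed_hazard(scores, near_crossing, path_intersects):
    # Alleged flaw: an object only counts as a crossing pedestrian
    # when it is labelled "pedestrian" AND is near a designated crossing.
    return (classify(scores) == "pedestrian"
            and near_crossing and path_intersects)

def safer_hazard(scores, near_crossing, path_intersects):
    # The goal argued for above: any object whose predicted path
    # intersects the car's path is a hazard, regardless of what it is
    # labelled or whether it is anywhere near a crossing.
    return path_intersects

# A person walking a bicycle mid-block: the classifier favours
# "bicycle", she is not at a crossing, but her path crosses the car's.
scores = {"pedestrian": 0.3, "bicycle": 0.6, "vehicle": 0.1}
print(flawed_hazard(scores, near_crossing=False, path_intersects=True))  # False
print(safer_hazard(scores, near_crossing=False, path_intersects=True))   # True
```

      The flawed version drops the hazard on two independent grounds (wrong label, wrong location), which is exactly the failure mode the report describes.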

      • (Score: 1) by khallow on Saturday November 09 2019, @02:14AM

        by khallow (3766) Subscriber Badge on Saturday November 09 2019, @02:14AM (#918129) Journal

        the code couldn't recognize her as a pedestrian, because she was not at an obvious designated crossing.

        That has nothing to do with optimization. I think we can agree that the software was doing it wrong (just in your sentence: "recognize as a pedestrian" when it shouldn't matter what she gets recognized as, "designated crossing" being in the decision process for spotting objects in the road, etc.), but that doesn't mean the software wasn't looking. Getting rid of such bugs is, after all, the point of this test driving in the first place.

        The algorithms in question are what I would consider minimisation or optimisation routines. From a set of input variables, they seek to find an optimised or minimised set of parameters to achieve some programmer-led goals.

        You should have a factor of safety here. These shouldn't be anywhere close to a minimization/optimization, just in case some unforeseen system or environmental issue pushes the scenario into a dangerous failure mode.
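        The factor-of-safety idea can be sketched in a few lines. The numbers and parameter names below are illustrative assumptions (not taken from the NTSB report): rather than operating at the optimised minimum stopping distance, you multiply it by a margin so an unmodelled failure (wet road, late detection) doesn't become fatal:

```python
def stopping_distance(speed_mps, decel_mps2=7.0, reaction_s=0.5):
    """Nominal distance to stop: reaction travel plus braking distance."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

def safe_follow_distance(speed_mps, safety_factor=2.0):
    # Apply a margin on top of the nominal optimum, so the system is
    # never operating right at the edge of what the model predicts.
    return safety_factor * stopping_distance(speed_mps)

v = 17.0  # roughly 61 km/h
print(round(stopping_distance(v), 1))    # 29.1 -- nominal minimum
print(round(safe_follow_distance(v), 1)) # 58.3 -- with the margin applied
```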