
posted by janrinok on Saturday May 26 2018, @12:19AM

Uber Self-Driving Car That Struck, Killed Pedestrian Wasn't Set To Stop In An Emergency

An Uber Technologies Inc. car involved in a deadly crash in Arizona wasn’t designed to automatically brake in case of an emergency, the National Transportation Safety Board said in its preliminary report on the accident.

The self-driving car, which was being tested on a public road with a human operator, struck and killed a pedestrian in Arizona in March. Uber said Wednesday that it was closing down its self-driving vehicle program in the state, two months after Arizona barred it from road-testing the technology.

NTSB Preliminary Report Released

The NTSB has released a preliminary report on the Uber pedestrian accident in Arizona. This is being reported widely, but it took a little digging to find the actual report, which is linked from the NTSB press release at https://www.ntsb.gov/news/press-releases/Pages/NR20180524.aspx The PDF report, 3½ pages with several illustrations, can be downloaded directly from https://goo.gl/2C6ZCH

From the second page of the report:

Uber had equipped the test vehicle with a developmental self-driving system. The system consisted of forward- and side-facing cameras, radars, LIDAR, navigation sensors, and a computing and data storage unit integrated into the vehicle. Uber had also equipped the vehicle with an aftermarket camera system that was mounted in the windshield and rear window and that provided additional front and rear videos, along with an inward-facing view of the vehicle operator. In total, 10 camera views were recorded over the course of the entire trip.

The self-driving system relies on an underlying map that establishes speed limits and permissible lanes of travel. The system has two distinct control modes: computer control and manual control. The operator can engage computer control by first enabling, then engaging the system in a sequence similar to activating cruise control. The operator can transition from computer control to manual control by providing input to the steering wheel, brake pedal, accelerator pedal, a disengage button, or a disable button.
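[Ed. note: the two-step enable/engage sequence and the input-triggered fallback to manual control described above can be sketched as a small state machine. This is a hypothetical reconstruction from the report's wording; the state and event names are illustrative, not Uber's.]

```python
# Minimal state machine for the control modes the NTSB report describes.
# Transitions follow the report's description; all names are illustrative.

MANUAL, ENABLED, COMPUTER = "manual", "enabled", "computer"

# Any of these operator inputs drops the system back to manual control.
DISENGAGE_INPUTS = {"steering", "brake", "accelerator",
                    "disengage_button", "disable_button"}

def next_mode(mode: str, event: str) -> str:
    """Return the control mode after an operator event."""
    if mode == MANUAL and event == "enable":
        return ENABLED    # first step, like arming cruise control
    if mode == ENABLED and event == "engage":
        return COMPUTER   # second step: computer control active
    if mode == COMPUTER and event in DISENGAGE_INPUTS:
        return MANUAL     # any operator input restores manual control
    return mode           # other events leave the mode unchanged

mode = MANUAL
for event in ["enable", "engage", "brake"]:
    mode = next_mode(mode, event)
print(mode)  # back to "manual" after the operator touches the brake
```

Note that per the next paragraph of the report, which Volvo driver-assistance functions are active also depends on this mode: City Safety works only in manual control.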

The vehicle was factory equipped with several advanced driver assistance functions by Volvo Cars, the original manufacturer. The systems included a collision avoidance function with automatic emergency braking, known as City Safety, as well as functions for detecting driver alertness and road sign information. All these Volvo functions are disabled when the test vehicle is operated in computer control but are operational when the vehicle is operated in manual control.

According to Uber, the developmental self-driving system relies on an attentive operator to intervene if the system fails to perform appropriately during testing. In addition, the operator is responsible for monitoring diagnostic messages that appear on an interface in the center stack of the vehicle dash and tagging events of interest for subsequent review.

On the night of the crash, the operator departed Uber's garage with the vehicle at 9:14 p.m. to run an established test route. At the time of the crash, the vehicle was traveling on its second loop of the test route and had been in computer control since 9:39 p.m. (i.e., for the preceding 19 minutes).

According to data obtained from the self-driving system, the system first registered radar and LIDAR observations of the pedestrian about 6 seconds before impact, when the vehicle was traveling at 43 mph. As the vehicle and pedestrian paths converged, the self-driving system software classified the pedestrian as an unknown object, as a vehicle, and then as a bicycle with varying expectations of future travel path. At 1.3 seconds before impact, the self-driving system determined that an emergency braking maneuver was needed to mitigate a collision (see figure 2). According to Uber, emergency braking maneuvers are not enabled while the vehicle is under computer control, to reduce the potential for erratic vehicle behavior. The vehicle operator is relied on to intervene and take action. The system is not designed to alert the operator.
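[Ed. note: some back-of-the-envelope kinematics show what an enabled emergency brake could have done at the 1.3-second decision point. Only the 43 mph speed and the 1.3-second figure come from the report; the 0.7 g deceleration is an assumed hard-braking value.]

```python
# Rough kinematics for the 1.3 s window the report describes.
# The 0.7 g deceleration is an assumed hard-braking figure, not from the report.

MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

speed = 43 * MPH_TO_MPS          # ~19.2 m/s at the braking decision point
t = 1.3                          # seconds between braking decision and impact
decel = 0.7 * G                  # ~6.9 m/s^2, assumed emergency deceleration

distance_to_impact = speed * t               # ~25 m covered if no one brakes
stopping_distance = speed**2 / (2 * decel)   # ~27 m needed to stop fully
impact_speed = max(0.0, speed - decel * t)   # ~10 m/s if braking starts at 1.3 s

print(f"distance to impact:  {distance_to_impact:.1f} m")
print(f"full stopping distance: {stopping_distance:.1f} m")
print(f"impact speed with braking: {impact_speed:.1f} m/s "
      f"(~{impact_speed / MPH_TO_MPS:.0f} mph)")
```

Under these assumptions the car could not quite have stopped in time, but braking from the 1.3-second mark would have roughly halved the impact speed, which is why the disabled emergency maneuver (and the missing operator alert) matter.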


Original Submission #1 | Original Submission #2

 
  • (Score: 3, Funny) by MostCynical on Saturday May 26 2018, @12:52AM (3 children)


    Uber has judged you, pedestrian, and you are not worthy.
Let this be a lesson to you: stay out of the way.

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
  • (Score: 5, Insightful) by Gaaark on Saturday May 26 2018, @03:45AM (2 children)


Funny, yes, but also quite true: they DID make a conscious decision to disable the system so it wouldn't make false, annoying braking maneuvers.

So instead of pushing the safety in the system to 11, they jacked it down to, say, 2, with a "we'll deal with the consequences later" attitude.
Someone dies, they pressure the family with "she was crossing the road illegally, here's a fuck load of money, now go away," and the family went "O'tay!"

    Should be forced to turn it back up to 11.

    --
    --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
    • (Score: 0) by Anonymous Coward on Saturday May 26 2018, @09:50AM (1 child)


      > they DID make a conscious decision to disable the system ...

      Not "they". Someone (or someones) had to make that decision. What kind of software professional would continue to work for a company where that decision was possible? I'd like to think that I would have quit long before (but of course hindsight is 20-20).

      • (Score: 1, Insightful) by Anonymous Coward on Saturday May 26 2018, @11:20AM


        The software people most likely did it with the understanding that the vehicle would have a full time driver watching the road while a tech monitored the system. If that understanding had been correct, there might not have been an accident, as the driver could have avoided the collision had she not been distracted. Combining the driving and monitoring jobs is the core problem, and that falls entirely on management.