
posted by takyon on Tuesday May 08 2018, @03:11PM   Printer-friendly
from the false-negative dept.

The first killing of a human by a machine acting entirely on its own initiative was "Likely Caused By Software Set to Ignore Objects On Road," according to a new report on the collision, which happened last March:

The car's sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber's software decided it didn't need to react right away. That's a result of how the software was tuned. Like other autonomous vehicle systems, Uber's software has the ability to ignore "false positives," or objects in its path that wouldn't actually be a problem for the vehicle, such as a plastic bag floating over a road. In this case, Uber executives believe the company's system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn't react fast enough, one of these people said.

Fast enough? She walked across three and a half lanes in what should have been plain view of the car's LIDAR the entire time.
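The tuning problem the report describes is essentially a classification threshold: raise it and the car brakes for fewer plastic bags, but it also brakes for fewer real pedestrians. Here is a minimal sketch of that tradeoff, with entirely hypothetical class names, confidence values, and thresholds (this is an illustration, not Uber's actual pipeline):

    from dataclasses import dataclass

    @dataclass
    class Detection:
        label: str        # classifier's best guess, e.g. "pedestrian" or "plastic_bag"
        confidence: float # how sure the classifier is that this is a real obstacle

    def should_brake(d: Detection, obstacle_threshold: float) -> bool:
        """Ignore detections below the threshold as presumed false positives."""
        return d.confidence >= obstacle_threshold

    bag = Detection("plastic_bag", confidence=0.3)
    pedestrian = Detection("pedestrian", confidence=0.6)  # e.g. partly occluded by a bicycle

    # A conservative threshold brakes for both detections:
    print(should_brake(bag, 0.25), should_brake(pedestrian, 0.25))  # True True
    # Raise the threshold to suppress bag-like false positives, and the
    # genuine pedestrian detection is suppressed with it -- a false negative:
    print(should_brake(bag, 0.7), should_brake(pedestrian, 0.7))    # False False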

takyon: Also at Reuters. Older report at The Drive.

Previously: Uber Pulls Self-Driving Cars After First Fatal Crash of Autonomous Vehicle
Video Released of Fatal Uber - Pedestrian Accident, and More


Original Submission #1
Original Submission #2

 
  • (Score: 5, Interesting) by theluggage (1797) on Tuesday May 08 2018, @05:26PM (#677101) (1 child)

    It was a faulty safety driver.

The idea of a minimum-wage safety driver is faulty. The people supervising these cars should be expert drivers: police drivers, advanced driving instructors, advanced police driving instructors, and the like, who are verging on OCD about correct car handling and safety. They shouldn't be updating their Facebook; they should be dictating a constant stream of commentary on how the car is driving from minute to minute. In the first instance, they should be driving, and the computer's decisions should be compared against theirs. For one thing, that means you'll actually get meaningful feedback on the system's performance instead of "Hey, yeah, it didn't crash this time".

  • (Score: 4, Insightful) by bob_super (1357) on Tuesday May 08 2018, @05:44PM (#677107)

The test driver should be driving at all times, and the computer would compare its decisions to the driver's decisions (Waymo probably went through that phase).
Who is actually connected to the road should then switch at random. That isn't as freaky as it sounds, if you keep in mind the wide roads and low speed limits. The human driver would have a trigger button on the wheel that guarantees they take control whenever the wheel is gripped tightly.
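A minimal sketch of the protocol both comments describe, with hypothetical names, types, and thresholds: the human drives, the autonomy stack plans in parallel ("shadow mode"), large disagreements are logged for review, and a grip trigger always hands control back to the human.

    from dataclasses import dataclass

    @dataclass
    class Action:
        steering: float  # wheel angle in radians, positive = left
        braking: float   # 0.0 (none) to 1.0 (full)

    def disagreement(human: Action, computer: Action) -> float:
        """Crude distance between the human's action and the computer's plan."""
        return abs(human.steering - computer.steering) + abs(human.braking - computer.braking)

    def control_step(human: Action, computer: Action, computer_in_control: bool,
                     wheel_gripped: bool, tolerance: float, log: list) -> Action:
        """Pick who actually drives this tick; record any large disagreement."""
        if disagreement(human, computer) > tolerance:
            log.append((human, computer))
        # A tight grip on the wheel overrides the random handoff schedule.
        if wheel_gripped or not computer_in_control:
            return human
        return computer

    log: list = []
    # The human brakes hard where the computer planned nothing; the grip
    # trigger forces the human's action through, and the mismatch is logged.
    chosen = control_step(Action(0.0, 0.8), Action(0.0, 0.0),
                          computer_in_control=True, wheel_gripped=True,
                          tolerance=0.3, log=log)
    print(chosen, len(log))  # human's action wins; one disagreement logged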