
SoylentNews is people

posted by takyon on Tuesday May 08 2018, @03:11PM   Printer-friendly
from the false-negative dept.

The first machine to kill a human entirely on its own initiative was "Likely Caused By Software Set to Ignore Objects On Road," according to a new report on the collision, which happened this past March:

The car's sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber's software decided it didn't need to react right away. That's a result of how the software was tuned. Like other autonomous vehicle systems, Uber's software has the ability to ignore "false positives," or objects in its path that wouldn't actually be a problem for the vehicle, such as a plastic bag floating over a road. In this case, Uber executives believe the company's system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn't react fast enough, one of these people said.

Fast enough? She walked across three and a half lanes in what should have been plain view of the car's LIDAR the entire time.
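The quoted report describes a confidence threshold below which detections are dismissed as false positives. A minimal sketch of that idea (all class names, confidence values, and thresholds here are invented for illustration; this is not Uber's actual pipeline):

```python
# Hypothetical illustration of false-positive suppression in an
# obstacle-detection pipeline. Names and numbers are invented.

def should_brake(detections, ignore_threshold=0.7):
    """Brake only for detections whose confidence exceeds the
    threshold; anything below it is treated as a false positive
    (e.g. a plastic bag floating over the road)."""
    return any(conf >= ignore_threshold for _, conf in detections)

# A pedestrian walking a bicycle may classify ambiguously, yielding
# lower confidence than a clear-cut car or wall:
frame = [("unknown_object", 0.55), ("plastic_bag", 0.10)]

print(should_brake(frame, ignore_threshold=0.4))  # cautious tuning: True (brakes)
print(should_brake(frame, ignore_threshold=0.7))  # aggressive tuning: False (ignores)
```

The failure mode the report describes corresponds to raising `ignore_threshold` until real obstacles fall below it.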

takyon: Also at Reuters. Older report at The Drive.

Previously: Uber Pulls Self-Driving Cars After First Fatal Crash of Autonomous Vehicle
Video Released of Fatal Uber - Pedestrian Accident, and More


Original Submission #1 | Original Submission #2

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2, Informative) by takyon on Tuesday May 08 2018, @03:48PM (5 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday May 08 2018, @03:48PM (#677060) Journal

    Drivers don't stop when a plastic bag or tumbleweed flies in front of their car.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @03:57PM (4 children)

    by Anonymous Coward on Tuesday May 08 2018, @03:57PM (#677066)

    you're not paying attention. the op said "laws are based on", as in "the pedestrian has the right away for this very reason."

    • (Score: 2) by tangomargarine on Tuesday May 08 2018, @04:21PM

      by tangomargarine (667) on Tuesday May 08 2018, @04:21PM (#677076)

      The phrase you're looking for is "right of way."

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
    • (Score: 3, Insightful) by takyon on Tuesday May 08 2018, @04:22PM (2 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday May 08 2018, @04:22PM (#677077) Journal

      If the car was paralyzed by any object it couldn't "definitively" identify, it would be stopping far too often.

      Autonomous car designers already err on the side of caution and have the car stop in annoying ways. Uber was trying to swing slightly the other way, and went too far. Only the laws that regulate autonomous vehicle testing have anything to do with it.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @10:32PM

        by Anonymous Coward on Tuesday May 08 2018, @10:32PM (#677210)

        At first they would stop too often. As time goes by they'd get smarter about it and just stop for things that do require it.

    If they're going to operate on public roads they should be extremely, over-the-top paranoid about running into unknown objects. And they should be operating slowly enough that any vehicle behind them can stop even with a maximum application of the brakes.
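        The "slow enough to stop" point can be made concrete with the textbook stopping-distance formula, reaction-time travel plus v^2/(2a). The deceleration and reaction-time figures below are typical illustrative values, not measurements from this incident:

        ```python
        def stopping_distance_m(speed_kmh, decel_mps2=7.0, reaction_s=1.5):
            """Total distance to stop: distance covered during the
            reaction time plus braking distance v^2 / (2a).
            Assumes constant deceleration (roughly dry pavement)."""
            v = speed_kmh / 3.6  # convert km/h to m/s
            return v * reaction_s + v * v / (2 * decel_mps2)

        print(round(stopping_distance_m(50), 1))  # 34.6 m at 50 km/h
        print(round(stopping_distance_m(60), 1))  # 44.8 m at 60 km/h
        ```

        Even modest speed differences change the required gap substantially, which is the commenter's point about following vehicles.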

      • (Score: 0) by Anonymous Coward on Wednesday May 09 2018, @02:09AM

        by Anonymous Coward on Wednesday May 09 2018, @02:09AM (#677296)

        > If the car was paralyzed by any object it couldn't "definitively" identify, it would be stopping far too often.

        This seems like a very good reason that it's too early to be testing these things on public roads. They need to be at least as good as an alert person (not distracted or impaired) at identifying every object/person/animal that could possibly get into their path.

        Did the Uber car have radar? If so, that bicycle (metal) being walked across the road should have returned a signal well in advance, leaving plenty of time to slow down and avoid running over that poor person.