

posted by takyon on Tuesday May 08 2018, @03:11PM   Printer-friendly
from the false-negative dept.

The first killing of a human by a machine acting entirely on its own initiative was "Likely Caused By Software Set to Ignore Objects On Road", according to a new report on the collision, which happened this past March:

The car's sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber's software decided it didn't need to react right away. That's a result of how the software was tuned. Like other autonomous vehicle systems, Uber's software has the ability to ignore "false positives," or objects in its path that wouldn't actually be a problem for the vehicle, such as a plastic bag floating over a road. In this case, Uber executives believe the company's system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn't react fast enough, one of these people said.
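
In rough terms (a hypothetical sketch of the idea only; the names and the threshold value are assumptions, not Uber's actual software), the tuning described above amounts to a single confidence threshold, where raising it trades fewer needless stops for more missed hazards:

    # Hypothetical illustration of "false positive" filtering, not Uber's code.
    # Raising BRAKE_THRESHOLD means fewer stops for harmless debris, but a
    # greater chance that a real hazard is dismissed -- a false negative.
    BRAKE_THRESHOLD = 0.8

    def should_brake(detection):
        # detection.hazard_confidence runs from 0.0 (looks like a drifting
        # plastic bag) to 1.0 (certainly a solid obstacle in the path).
        return detection.hazard_confidence >= BRAKE_THRESHOLD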

Fast enough? She walked across three and a half lanes in what should have been plain view of the car's LIDAR the entire time.

takyon: Also at Reuters. Older report at The Drive.

Previously: Uber Pulls Self-Driving Cars After First Fatal Crash of Autonomous Vehicle
Video Released of Fatal Uber - Pedestrian Accident, and More


Original Submission #1
Original Submission #2

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1, Insightful) by Anonymous Coward on Tuesday May 08 2018, @03:42PM (10 children)

    by Anonymous Coward on Tuesday May 08 2018, @03:42PM (#677056)

    Uber's software has the ability to ignore "false positives," or objects in its path that wouldn't actually be a problem for the vehicle, such as a plastic bag floating over a road.

    So basically it assumes to drive over it, unless classified as something it shouldn't drive over. Shouldn't that be the other way around? Like, safety first, on which most traffic laws are based on?
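
    The distinction being drawn here can be sketched in a few lines (a hypothetical illustration, not any vendor's real code):

        # Fail-safe default: brake unless the object is positively harmless.
        def react_safety_first(obj):
            return "IGNORE" if obj.classified_as_harmless else "BRAKE"

        # The opposite default: keep going unless the object is positively a
        # hazard, so anything the classifier is unsure about gets driven over.
        def react_keep_driving(obj):
            return "BRAKE" if obj.classified_as_hazard else "IGNORE"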

  • (Score: 2, Informative) by takyon on Tuesday May 08 2018, @03:48PM (5 children)

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Tuesday May 08 2018, @03:48PM (#677060) Journal

    Drivers don't stop when a plastic bag or tumbleweed flies in front of their car.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @03:57PM (4 children)

      by Anonymous Coward on Tuesday May 08 2018, @03:57PM (#677066)

      you're not paying attention. the op said "laws are based on", as in "the pedestrian has the right away for this very reason."

      • (Score: 2) by tangomargarine on Tuesday May 08 2018, @04:21PM

        by tangomargarine (667) on Tuesday May 08 2018, @04:21PM (#677076)

        The phrase you're looking for is "right of way."

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 3, Insightful) by takyon on Tuesday May 08 2018, @04:22PM (2 children)

        by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Tuesday May 08 2018, @04:22PM (#677077) Journal

        If the car was paralyzed by any object it couldn't "definitively" identify, it would be stopping far too often.

        Autonomous car designers already err on the side of caution and have the car stop in annoying ways. Uber was trying to swing slightly the other way, and went too far. Only the laws that regulate autonomous vehicle testing have anything to do with it.

        --
        [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @10:32PM

          by Anonymous Coward on Tuesday May 08 2018, @10:32PM (#677210)

          At first they would stop too often. As time goes by they'd get smarter about it and just stop for things that do require it.

          If they're going to operate on public roads they should be extremely, over-the-top paranoid about running into unknown objects. And they should be operating slowly enough for any vehicles behind them to stop even with a maximum application of the brakes.
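
          As a rough sketch of that constraint (the reaction time and braking figures are generic assumptions, not data from this incident):

              # Back-of-the-envelope stopping distance for traffic behind the
              # autonomous car: distance covered while reacting, plus distance
              # covered under maximum braking.
              def stopping_distance_m(speed_mps, reaction_s=1.5, decel_mps2=7.0):
                  return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

              print(round(stopping_distance_m(17.9), 1))  # ~49.7 m at roughly 40 mph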

        • (Score: 0) by Anonymous Coward on Wednesday May 09 2018, @02:09AM

          by Anonymous Coward on Wednesday May 09 2018, @02:09AM (#677296)

          > If the car was paralyzed by any object it couldn't "definitively" identify, it would be stopping far too often.

          This seems like a very good reason that it's too early to be testing these things on public roads. They need to be at least as good as an alert person (not distracted or impaired) at identifying every object/person/animal that could possibly get into their path.

          Did the Uber car have radar? If so, that bicycle (metal) that was being walked across the road should have returned a signal well in advance, plenty of time to slow down and avoid running over that poor person.

  • (Score: 5, Insightful) by ledow on Tuesday May 08 2018, @04:02PM (3 children)

    by ledow (5567) on Tuesday May 08 2018, @04:02PM (#677068) Homepage

    Nope.

    Slamming the brakes on because a bit of tumbleweed blows across would probably cause more accidents.

    And what if that bag was just lying in the road with a brick in it? You could probably tell as a human that it "wasn't right".
    And you'd probably still try to slow/avoid it even if it's just a blowing empty bag (those things tend to stick to your exhaust and melt with a horrendous smell).

    Fact is, the car wasn't able to judge, because it didn't have any sensor capable of judging such things, nor of any intelligence to apply to the sensors it had to judge such things.

    Imagine being in a court. And quite literally saying "Sorry, your Honour, but I couldn't tell if the woman with the bike was a paper bag or not, so I just drove over it".
    Would it pass muster for a human? No. So why would a company's software claiming to do that human's job be any different?

    Fact is, the software isn't there to make these kinds of decisions. It all has to be "tuned" (my prime argument against anyone calling anything we have today AI - they are human-fed heuristics at best, and poor ones at that) because it can't infer anything about the situation whatsoever. All the fancy claims just come down to someone turning a software dial between "run over plastic bags" and "mow down little old ladies". If they tune it wrong, the software goes wrong. And obviously that's what's happened here.

    This stuff isn't ready to risk people's lives on.

    Give it a few more decades of off-road testing and you might get closer.

    • (Score: 3, Insightful) by bob_super on Tuesday May 08 2018, @05:38PM (1 child)

      by bob_super (1357) on Tuesday May 08 2018, @05:38PM (#677105)

      > Fact is, the car wasn't able to judge, because it didn't have any sensor capable of judging such things,
      > nor of any intelligence to apply to the sensors it had to judge such things.

      I'm not an autonomous car designer. But if I was, the first thing to design and qualify on the vision system would be "can you identify a walking or standing human?".

      The plastic bag is a terrible example. The whole point of the sensor system is to clearly sort human shapes, and tag their position and speed. The vision and classification system, presented with a few seconds of a walking woman, should never ever have classified her as anything other than "Walking human, going right at walking pace". It's a very distinctive shape, and "we have to avoid braking for plastic bags" is total bullshit. We're not looking at braking for a human chalk outline, a kid's toy, or a plastic bag, but at not correctly figuring out lidar test case number 1: full-size human walking perpendicular to travel direction.
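
      A crude sketch of the kind of check described above (the dimensions, speeds, and field names are assumptions for illustration, not production code):

          # A tracked lidar cluster with roughly human proportions, moving at
          # walking pace across the direction of travel, is about the easiest
          # classification case there is.
          def looks_like_pedestrian(cluster):
              human_sized = 1.0 < cluster.height_m < 2.2 and cluster.width_m < 1.5
              walking_pace = 0.5 < cluster.lateral_speed_mps < 2.5
              persistent = cluster.track_age_s > 1.0  # tracked across many frames
              return human_sized and walking_pace and persistent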

      • (Score: 4, Disagree) by ledow on Tuesday May 08 2018, @07:30PM

        by ledow (5567) on Tuesday May 08 2018, @07:30PM (#677149) Homepage

        Now distinguish between a plastic bag and a guy crouching in a shiny black puffer jacket (google it if you're unfamiliar) to tie his shoelace in the middle of the road. If you haven't had stuff like that happen to you, everything from deer, to children running between parked cars, to things like tyres rolling into the road in front of you, then you haven't been driving long or you haven't been paying attention enough that you COULD interpret them as a threat.

        You have literally no way to teach a computer the difference in any simple fashion. Certainly not in dark conditions (but then it shouldn't be making any decision about things it can't illuminate clearly), and certainly not at the speeds involved here.

        Even humans get it wrong. But humans can be brought before a court to explain themselves. A computer system cannot, at the moment. A human can tell you that it was dark, and he looked exactly like a bin bag, etc., and a court may believe them. The computers involved here do not have that kind of analysis, or ability to justify themselves. It was 51% a hazard they could drive through and 49% not, so they rammed it without even trying to brake.

        If you're going to have different systems on the same roads and claim equivalence (or, worse, superiority as the self-driving car manufacturers attempt), then they need to be just as accountable as a human in the same position.

        In this incident, my first question as a lawyer would be "So, Mr Driver, what was your visibility and were you doing an appropriate speed for the conditions? (Answer: No, as they hit something that literally appears in their headlights way within their braking distance making it impossible to stop). How long did you get to assess the object you hit and whether or not to hit it? (Answer: A second or two). At which point do you take action to avoid a collision? (Answer: None, watch the video - no braking, no moving, nothing). And, crucially, what other hazards were there on the road at the time? (Answer: None. Not raining, not snowing, not oily ground, no car in front, behind or to the side, no crowds of pedestrians, etc. etc. etc.)"

        You'd convict a human driver of causing death by dangerous driving in such circumstances. So what's the sanction for the manufacturer now?

    • (Score: 2) by choose another one on Tuesday May 08 2018, @07:19PM

      by choose another one (515) Subscriber Badge on Tuesday May 08 2018, @07:19PM (#677146)

      > Imagine being in a court. And quite literally saying "Sorry, your Honour, but I couldn't tell if the woman with the bike was a paper bag or not, so I just drove over it".

      But that isn't what happened; your statement implies you had already decided the object was a woman with a bike, and then drove over it.
      A more accurate comparison would be: "Sorry your Honour but I couldn't tell if what I saw in the road was a black bag or some clothes or a human being, so I just drove over it".

      > Would it pass muster for a human?

      Yes, sometimes at least - in fact I know of cases where (my version, roughly) it has.

      I also know of cases where the same argument didn't pass muster - but it took two trials to get a verdict, and that was probably because reconstructions showed the obstacle-that-was-actually-a-human would have been visible for over 100m in a 30mph limit, giving 9 seconds to act (the driver wasn't speeding). The Uber vehicle was going faster (but not speeding) and a human driving it would have had a lot less time.
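
      For scale, the arithmetic behind those figures (my own back-of-the-envelope check, not trial data):

          speed_mps = 30 * 0.44704   # 30 mph is about 13.4 m/s
          print(100 / speed_mps)     # about 7.5 s to cover 100 m at that speed
          print(9 * speed_mps)       # about 121 m needed for a full 9 seconds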