posted by martyb on Saturday June 23 2018, @03:41PM   Printer-friendly
from the unfortunate dept.

According to this article on MSN:

Police in Tempe, Arizona said evidence showed the "safety" driver behind the wheel of a self-driving Uber was distracted and streaming a television show on her phone right up until about the time of a fatal accident in March, deeming the crash that rocked the nascent industry "entirely avoidable."

A 318-page report from the Tempe Police Department, released late on Thursday in response to a public records request, said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up just a half second before the car hit 49-year-old Elaine Herzberg, who was crossing the street at night.

According to the report, Vasquez could face charges of vehicle manslaughter. Police said that, based on testing, the crash was "deemed entirely avoidable" if Vasquez had been paying attention.

Police obtained records from Hulu, an online service for streaming television shows and movies, which showed Vasquez's account was playing the television talent show "The Voice" the night of the crash for about 42 minutes, ending at 9:59 p.m., which "coincides with the approximate time of the collision," the report says.

It is not clear if Vasquez will be charged, and police submitted their findings to county prosecutors, who will make the determination.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Insightful) by frojack on Saturday June 23 2018, @04:33PM (26 children)

    by frojack (1554) on Saturday June 23 2018, @04:33PM (#697223) Journal

    On the other hand, it was a self driving car.

    If self-driving car companies are going to hand off control (and culpability) to humans only when the car is already in an emergency, then they serve no value and should be outlawed.

    There is already information that the car was programmed incorrectly by Uber, then handed to an employee who was told it was self-driving.

    You can't have it both ways.

    --
    No, you are mistaken. I've always had this sig.
  • (Score: 5, Insightful) by DrkShadow on Saturday June 23 2018, @04:46PM

    by DrkShadow (1404) on Saturday June 23 2018, @04:46PM (#697226)

    There is already information that the car was programmed incorrectly by Uber, then handed to an employee who was told

    The only thing it makes sense to have told the driver is that this car is in testing, will make mistakes, and you, the driver, are there to assure that a mistake by this car doesn't lead to something catastrophic. You're expected to minimize liability and be able to mitigate any and all errors should they happen.

  • (Score: 5, Informative) by maxwell demon on Saturday June 23 2018, @04:46PM (9 children)

    by maxwell demon (1608) on Saturday June 23 2018, @04:46PM (#697227) Journal

    He was a test driver. The whole reason he was employed was because the self-driving car is still in development and therefore not yet assumed to be safe for unsupervised operation. It was his job to prevent accidents like this.

    --
    The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2, Offtopic) by frojack on Saturday June 23 2018, @04:49PM (3 children)

      by frojack (1554) on Saturday June 23 2018, @04:49PM (#697229) Journal

      I know it's all the rage these days to be gender fluid, but the police said the safety driver was a she.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 0, Offtopic) by Anonymous Coward on Saturday June 23 2018, @07:10PM

        by Anonymous Coward on Saturday June 23 2018, @07:10PM (#697314)

        which is particularly annoying, because it would not be politically correct to emphasize that we're dealing with a woman driver here.

      • (Score: 2) by Gaaark on Saturday June 23 2018, @09:08PM (1 child)

        by Gaaark (41) on Saturday June 23 2018, @09:08PM (#697346) Journal

        Damn, dog!
        I thought a SECOND uber car must have hit poor Elaine, because i would have testified the first driver was a man.

        D.A.M.N!

        --
        --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
        • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @02:01PM

          by Anonymous Coward on Sunday June 24 2018, @02:01PM (#697556)

          > i would have testified the first driver was a man.
          i would have testified the first driver was male.

          To be a man, he would have to man-up and apologize publicly(?)

    • (Score: 5, Insightful) by EETech1 on Saturday June 23 2018, @06:17PM (4 children)

      by EETech1 (957) on Saturday June 23 2018, @06:17PM (#697294)

      Need a +1 exactly mod for this!

      I used to do integration testing of a certain marine drive-by-wire system with various aftermarket autopilots, and while it was really nice to sit in the breeze while being driven around on an endless scenic tour, there are countless things that constantly go wrong, especially during active development. It could be an entirely different vehicle after lunch. Same driver's seat, steering wheel, buttons, etc., but deep inside, the software running it all had changed (release notes FTW), so this time around the lake is not like the last!

      Even out on the open water, there's still hazards, and driving in navigation channels is crazy dangerous because boats can go anywhere.

      My number one responsibility was to make sure it was safe. Safe from itself, safe from others, safe from its surroundings.
      My number two responsibility was to put it in difficult situations, and try to make it screw up.

      I had to have a much higher level of situational awareness than a normal boater. Anything could happen, and too late doesn't wait. Lives are at stake when you are assigned responsibility for testing such a vehicle!

      Perhaps autonomous cars are good enough to navigate roads, but they need to spend a few years on a closed course with planned hazards and trained humans before we really know enough about how they are going to react to turn them loose on an unsuspecting public.

      • (Score: 0) by Anonymous Coward on Saturday June 23 2018, @10:26PM (2 children)

        by Anonymous Coward on Saturday June 23 2018, @10:26PM (#697372)

        There is already good equipment available for non-destructive testing off the highway. For example, there are little electric robot "sleds" that are not damaged when driven over. They can roam all over a proving ground, coordinated externally, and can hold either a person manikin or even a "bicycle" manikin: https://www.youtube.com/watch?v=x7-SS1LxjPw [youtube.com] The same company also sells larger platforms that carry dummy cars with correct light and radar reflectivity; they come apart when hit and snap back together. Originally developed for ADAS (advanced driver assist system) testing, they work equally well for AV testing.

        • (Score: 2) by maxwell demon on Sunday June 24 2018, @09:33AM (1 child)

          by maxwell demon (1608) on Sunday June 24 2018, @09:33AM (#697489) Journal

          That's all nice, but the problem with those is that if you prearrange tests, then you test only what you thought of. At some point you have to go out and test it in the real world, as only that will tell you how well the car copes with unexpected situations.

          --
          The Tao of math: The numbers you can count are not the real numbers.
          • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @02:07PM

            by Anonymous Coward on Sunday June 24 2018, @02:07PM (#697558)

            Of course. But a dummy/person jaywalking obliviously (with or without pushing a bicycle) across a wide road would be one of the obvious test cases for proving ground debugging. Early recognition by the AV AND a speed and/or path change to miss the pedestrian trajectory would be one reasonable result.

            I have no idea what kinds of development testing Uber has done off the public roads, but it appears they didn't do this one.

      • (Score: 2) by Fluffeh on Monday June 25 2018, @04:30AM

        by Fluffeh (954) Subscriber Badge on Monday June 25 2018, @04:30AM (#697963) Journal

        You have to keep in mind though, that this is Uber we're talking about. Not exactly a company well known for trying to "do things right" but rather "do it cheap, then improve only if you HAVE to".

        They would be paying bottom dollar for the "tester" and giving the minimum training required to get what they think is needed to put the car on the road. Their entire existence thus far could be summed up as "economy of scale, and try to overpower anything else in court". This is just another bump in their lawyers' billings, and in all likelihood they will try to throw their "tester" under the bus (oh, the irony of that statement) to get out of paying extra damages or having any additional restrictions placed on their driverless testing programme.

  • (Score: 3, Troll) by Runaway1956 on Saturday June 23 2018, @04:49PM

    by Runaway1956 (2926) Subscriber Badge on Saturday June 23 2018, @04:49PM (#697228) Journal

    The whole point is: there is no such thing as a "self driving car". Not yet. Everything that I have read, to date, stipulates that the "autopilot" or whatever may fail at any time, and that the driver should be ready to take over. And, in this particular case, the driver was hired as a "safety" backup.

    You don't get to use alpha software in mission-critical situations, yawn, and go to sleep, allowing that alpha software to run amok. The purpose of that safety driver is to help "evolve" that software into a beta state, while at the same time ensuring that the alpha doesn't kill anyone.

  • (Score: 4, Insightful) by SomeGuy on Saturday June 23 2018, @04:50PM (13 children)

    by SomeGuy (5632) on Saturday June 23 2018, @04:50PM (#697230)

    "programmed incorrectly"?

    I laugh at the idea that such a complex system could ever be programmed absolutely "correctly". Most modern software is subject to weekly updates that fix a constant barrage of security issues and bugs, and may at any time introduce sloppy, random, buggy new mis-features at some manager's whim. Especially if your vendor decides to drop support for a car after two years or so but people keep driving it anyway.

    I personally don't believe "self driving" cars can become a reality until people change the way they think of roads and "driving". By necessity, roads must be thought of more like railroad tracks. If you run out onto a railroad track and get run over, whose fault is it likely to be?

    Until "self driving", one way or the other, really becomes self driving enough that people actually CAN sit back and watch TV then it should not ever, ever be called "self driving".

    • (Score: 5, Interesting) by frojack on Saturday June 23 2018, @05:00PM (8 children)

      by frojack (1554) on Saturday June 23 2018, @05:00PM (#697237) Journal

      I laugh at the idea that such a complex system could ever be programmed absolutely "correctly".

      Nothing can be. However, make light of that as you may, you can't escape this:

      http://money.cnn.com/2018/05/24/technology/uber-arizona-self-driving-report/index.html [cnn.com]

      According to the National Transportation Safety Board, Uber's self-driving car accurately identified pedestrian Elaine Herzberg, 49, as she walked a bicycle across a Tempe, Arizona, road. But Uber had turned off the vehicle's automatic emergency braking, so the SUV did not attempt to brake.

      The SUV also lacked a way to alert the human driver behind the wheel to manually brake.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 2) by RS3 on Saturday June 23 2018, @05:21PM (6 children)

        by RS3 (6367) on Saturday June 23 2018, @05:21PM (#697245)

        According to the National Transportation Safety Board, Uber's self-driving car accurately identified pedestrian Elaine Herzberg, 49, as she walked a bicycle across a Tempe, Arizona, road. But Uber had turned off the vehicle's automatic emergency braking, so the SUV did not attempt to brake.

        The SUV also lacked a way to alert the human driver behind the wheel to manually brake.

        I remember reading that, but I don't know the reasoning. I speculate that Uber did not want false-positives to cause the car to brake suddenly, for no good reason, and increase the chance of a rear-end collision.
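        The tradeoff being speculated about here can be sketched numerically. Below is a minimal, purely illustrative time-to-collision (TTC) braking rule; the function names and the 1.5-second threshold are my own assumptions for illustration, not Uber's or any vendor's actual logic:

        ```python
        # Hypothetical sketch of the false-positive tradeoff in automatic
        # emergency braking (AEB), based on time-to-collision (TTC).
        # All numbers and names are illustrative.

        def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
            """Seconds until impact if neither object changes speed."""
            if closing_speed_mps <= 0:
                return float("inf")  # not closing: no collision course
            return range_m / closing_speed_mps

        def should_brake(range_m: float, closing_speed_mps: float,
                         ttc_threshold_s: float = 1.5) -> bool:
            """Brake when the predicted impact is closer than the threshold.

            Too low a threshold misses real hazards (false negatives);
            too high a threshold triggers on plates, shadows, etc.
            (false positives), risking exactly the sudden stops and
            rear-end collisions speculated about above.
            """
            return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s

        # A pedestrian 20 m ahead with the car closing at 17 m/s (~38 mph):
        # TTC is about 1.18 s, under the 1.5 s threshold, so brake.
        print(should_brake(20.0, 17.0))  # True
        ```

        Tuning that single threshold is where the false-positive/false-negative balance lives; disabling braking entirely, as reported, removes the tradeoff altogether.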

        • (Score: 2, Insightful) by Anonymous Coward on Saturday June 23 2018, @06:11PM (2 children)

          by Anonymous Coward on Saturday June 23 2018, @06:11PM (#697289)

          Of all the collisions you can have, the rear-end collision is the only one that wouldn't be the fault of Uber. Drivers are required to maintain sufficient space ahead of them that they can stop if the car ahead slams on its brakes or otherwise comes to a stop.

          Being rear-ended is also the kind of collision for which a typical car has the best protection for the driver and the passengers.
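          The following-distance rule above can be made concrete with a back-of-envelope stopping-distance estimate. The reaction time and deceleration figures below are textbook approximations I've assumed for illustration, not numbers from the report:

          ```python
          # Rough stopping-distance estimate: distance covered while
          # reacting plus distance covered while braking to a stop.
          #   d = v * t_react + v^2 / (2 * a)

          def stopping_distance_m(speed_mps: float,
                                  reaction_time_s: float = 1.5,
                                  decel_mps2: float = 7.0) -> float:
              """Total distance to stop from speed_mps, in metres."""
              reaction = speed_mps * reaction_time_s
              braking = speed_mps ** 2 / (2 * decel_mps2)
              return reaction + braking

          # At 50 km/h (~13.9 m/s), a following driver needs roughly:
          v = 50 / 3.6
          print(round(stopping_distance_m(v), 1))  # 34.6 (metres)
          ```

          That ~35 m gap at city speed is what the "sufficient space" rule is meant to guarantee, which is why a sudden false-positive stop puts the blame on the follower rather than the stopping car.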

          • (Score: 2) by frojack on Saturday June 23 2018, @07:23PM (1 child)

            by frojack (1554) on Saturday June 23 2018, @07:23PM (#697319) Journal

            Of all the collisions you can have, the rear end collision is the only one that wouldn't be the fault of Uber.

            Say what?

            If uber disabled the forward collision avoidance system, it most certainly would be their fault.

            --
            No, you are mistaken. I've always had this sig.
            • (Score: 1, Informative) by Anonymous Coward on Saturday June 23 2018, @10:37PM

              by Anonymous Coward on Saturday June 23 2018, @10:37PM (#697374)

              Woooosh?
              I read g-parent as saying, "If Uber left their version of automatic braking on, the Uber car might stop semi-randomly (false alarm) and GET rear-ended by some other car following too closely." At least in most cases, that would be the fault of the following car (driven by some unsuspecting person who wasn't expecting the Uber AV to stop at that time).

              Separate thought:
              The big mistake (IMO) was that Uber didn't leave the Volvo e-brake system active; that system has already been debugged and probably would have saved the woman pushing the bicycle.

        • (Score: 5, Informative) by frojack on Saturday June 23 2018, @07:21PM (2 children)

          by frojack (1554) on Saturday June 23 2018, @07:21PM (#697318) Journal

          False positives really are not a problem with these systems.

          This is well-proven technology, available for a decade on high-end cars and now filtering down to almost every brand: Subaru, Honda, Chevy, Ford. Standard equipment in most cases.

          On my 2012-vintage car, I've seen maybe 4 false positives, all from metal plates (construction plates) covering the roadway, and only at the bottom of a downgrade. An alarm sounds and the dash flashes BRAKE, but before the automatic brakes kick in the system realizes its error, does not brake, and extinguishes the alarm.

          In actual danger situations my car does brake authoritatively. My car detects the need to brake at least two cars ahead, even if the car immediately ahead does not brake. It braked for deer on a night so rainy and dark I couldn't see squat.

          Again, Mine is old-ish tech - 2012. More modern systems are even better at this.

          False positives, for all intents and purposes, just don't happen. Turning this off on a car you will be carrying paying passengers is just insanely irresponsible.

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 1, Interesting) by Anonymous Coward on Saturday June 23 2018, @09:15PM (1 child)

            by Anonymous Coward on Saturday June 23 2018, @09:15PM (#697348)

            False positives really are not a problem with these systems.

            How so? I was an engineer on safety critical systems and false positives were treated as nearly as big of a failure as false negatives.

            • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @12:07AM

              by Anonymous Coward on Sunday June 24 2018, @12:07AM (#697393)
              And that's why false positives are not a problem on factory systems. Someone like you had them debugged.
      • (Score: 0) by Anonymous Coward on Saturday June 23 2018, @05:36PM

        by Anonymous Coward on Saturday June 23 2018, @05:36PM (#697257)

        Yeah, hacking is a thing so we really shouldn't be sending out millions of potential death machines.

    • (Score: 4, Insightful) by Gaaark on Saturday June 23 2018, @09:45PM (2 children)

      by Gaaark (41) on Saturday June 23 2018, @09:45PM (#697359) Journal

      AND, i don't want a car i have to watch over but am not in constant control of: it IS too easy to get distracted.
      "Ooooh.... gorgeous woman!" WHAM!

      If it CANNOT control itself in EVERY situation, and i may be held responsible for any bad outcomes because it suddenly relinquishes control to me, then i don't want it.

      If i'm gonna kill someone (through the car being 'at fault') and be held responsible, I WANT FULL CONTROL. Otherwise, i want exemption from prosecution.

      This "if the car cannot figure out a problem, it hands control to the driver" stuff is nonsense: the human will either be ready (and is probably almost driving it himself) or he will be distracted and unable to assume control fast enough.

      Noooooooooooooooooooooooooope, not for me.

      --
      --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
      • (Score: 3, Insightful) by maxwell demon on Sunday June 24 2018, @09:39AM (1 child)

        by maxwell demon (1608) on Sunday June 24 2018, @09:39AM (#697491) Journal

        Maybe that's the thing Uber did wrong: they should not have had the driver just sitting there in case something goes wrong, but should have tasked him with constantly commenting on the behaviour of the car and the current traffic situation, to be recorded alongside the car data. Even if the comments are not too useful by themselves (but who knows, they might uncover something interesting, too), it would ensure that the driver is focused on the behaviour of the car, and thus aware of any faults it makes.

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @02:16PM

          by Anonymous Coward on Sunday June 24 2018, @02:16PM (#697562)

          Excellent idea. Reminds me of high school Driver's Education (early 1970s). One of the exercises we did in the car was "commentary driving", where the student driver was instructed to comment out loud about everything (possible threats) they noticed: car approaching from rear left, about to pass | light ahead turned green | check right mirror, glance into right blind spot, nothing close on the right at this time, could move right if the passing car comes too close | ...

          I don't think the instructor called it stream of consciousness, but that was essentially what we did.

    • (Score: 2) by realDonaldTrump on Sunday June 24 2018, @01:39AM

      by realDonaldTrump (6614) on Sunday June 24 2018, @01:39AM (#697425) Homepage Journal

      We're in the Age of Computer, the Age of Cyber, but nobody really knows what's going on. But nobody really knows what's going on inside somebody's head, either. Brain is still a HUGE mystery. The big thing is, cyber will be much cheaper. Cyber never sleeps. And cyber never needs to stop and take a leak. Crooked Hillary, did you see how long she was in the bathroom when we were supposed to be debating? Unbelievable!

      And we need the workers, believe me. No longer can we count on foreign workers coming in, that's going away very quickly. As our economy grows TREMENDOUSLY. The stock market and everything else. All those guys that are driving now -- the taxi, the truck, the limo -- we're going to need them for other jobs. Better jobs. In my Space Force and in the factories that are coming back from China, from Mexico, from all around the world. MAGA!!!