
posted by martyb on Saturday June 23 2018, @03:41PM   Printer-friendly
from the unfortunate dept.

According to this article on MSN:

Police in Tempe, Arizona said evidence showed the "safety" driver behind the wheel of a self-driving Uber was distracted and streaming a television show on her phone right up until about the time of a fatal accident in March, deeming the crash that rocked the nascent industry "entirely avoidable."

A 318-page report from the Tempe Police Department, released late on Thursday in response to a public records request, said the driver, Rafaela Vasquez, repeatedly looked down and not at the road, glancing up just a half second before the car hit 49-year-old Elaine Herzberg, who was crossing the street at night.

According to the report, Vasquez could face charges of vehicular manslaughter. Police said that, based on testing, the crash was "deemed entirely avoidable" if Vasquez had been paying attention.

Police obtained records from Hulu, an online service for streaming television shows and movies, which showed Vasquez's account was playing the television talent show "The Voice" the night of the crash for about 42 minutes, ending at 9:59 p.m., which "coincides with the approximate time of the collision," the report says.

It is not clear if Vasquez will be charged, and police submitted their findings to county prosecutors, who will make the determination.


Original Submission

  • (Score: 5, Insightful) by Runaway1956 on Saturday June 23 2018, @03:51PM (37 children)

    by Runaway1956 (2926) Subscriber Badge on Saturday June 23 2018, @03:51PM (#697214) Journal

    No, not just rhetoric. This driver was paid to be where he is. That makes Vasquez a "professional". It was her professional duty to ensure that the car operated in a safe manner. She had no less responsibility than a truck driver hauling 40,000 pounds of groceries, or hazardous material, or even explosives, through town.

    This driver needs to be held up as an example. One example for professional drivers, and a secondary example for the average idiot on the highway. PAY ATTENTION TO THE ROAD!!

    Maybe in 20 years, or 100 years, or sometime, robots and computers will be smart enough, maybe even intelligent enough, that no person should ever override the judgement of the computer. That day is not here. That makes YOU, THE DRIVER, responsible.

    Dereliction of duty, reckless endangerment, wrongful death, manslaughter - and I'm sure the "big truck specialist" lawyers can come up with a helluva lot more.

    Burn her.

    • (Score: 5, Insightful) by frojack on Saturday June 23 2018, @04:33PM (26 children)

      by frojack (1554) on Saturday June 23 2018, @04:33PM (#697223) Journal

      On the other hand, it was a self driving car.

      If self driving car companies are going to hand off control (and culpability) to humans when the car is already in an emergency, then they serve no value, and should be outlawed.

      There is already information that the car was programmed incorrectly by Uber, then handed to an employee who was told it was self driving.

      You can't have it both ways.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 5, Insightful) by DrkShadow on Saturday June 23 2018, @04:46PM

        by DrkShadow (1404) on Saturday June 23 2018, @04:46PM (#697226)

        There is already information that the car was programmed incorrectly by uber, then handed to an employee who was told

        The only thing it makes sense to have told the driver is that this car is in testing, will make mistakes, and you, the driver, are there to assure that a mistake by this car doesn't lead to something catastrophic. You're expected to minimize liability and be able to mitigate any and all errors should they happen.

      • (Score: 5, Informative) by maxwell demon on Saturday June 23 2018, @04:46PM (9 children)

        by maxwell demon (1608) on Saturday June 23 2018, @04:46PM (#697227) Journal

        He was a test driver. The whole reason he was employed was because the self-driving car is still in development and therefore not yet assumed to be safe for unsupervised operation. It was his job to prevent accidents like this.

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 2, Offtopic) by frojack on Saturday June 23 2018, @04:49PM (3 children)

          by frojack (1554) on Saturday June 23 2018, @04:49PM (#697229) Journal

          I know it's all the rage these days to be gender fluid, but the police said the safety driver was a she.

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 0, Offtopic) by Anonymous Coward on Saturday June 23 2018, @07:10PM

            by Anonymous Coward on Saturday June 23 2018, @07:10PM (#697314)

            which is particularly annoying, because it would not be politically correct to emphasize that we're dealing with a woman driver here.

          • (Score: 2) by Gaaark on Saturday June 23 2018, @09:08PM (1 child)

            by Gaaark (41) on Saturday June 23 2018, @09:08PM (#697346) Journal

            Damn, dog!
            I thought a SECOND uber car must have hit poor Elaine, because i would have testified the first driver was a man.

            D.A.M.N!

            --
            --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
            • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @02:01PM

              by Anonymous Coward on Sunday June 24 2018, @02:01PM (#697556)

              > i would have testified the first driver was a man.
              i would have testified the first driver was male.

              To be a man, he would have to man-up and apologize publicly(?)

        • (Score: 5, Insightful) by EETech1 on Saturday June 23 2018, @06:17PM (4 children)

          by EETech1 (957) on Saturday June 23 2018, @06:17PM (#697294)

          Need a +1 exactly mod for this!

          I used to do integration testing of a certain marine drive-by-wire system with various aftermarket autopilots, and while it was really nice to sit in the breeze while being driven around the area on an endless scenic tour, there are countless things that constantly go wrong, especially during active development. It could be an entirely different vehicle after lunch. Same driver's seat, steering wheel, buttons, etc. But deep inside, the software running it all had changed (release notes FTW), so this time around the lake is not like the last!

          Even out on the open water, there's still hazards, and driving in navigation channels is crazy dangerous because boats can go anywhere.

          My number one responsibility was to make sure it was safe. Safe from itself, safe from others, safe from its surroundings.
          My number two responsibility was to put it in difficult situations, and try to make it screw up.

          I had to have a much higher level of situational awareness than a normal boater. Anything could happen, and too late doesn't wait. Lives are at stake when you are assigned responsibility for testing such a vehicle!

          Perhaps autonomous cars are good enough to navigate roads, but they need to spend a few years on a closed course with planned hazards and trained humans before we really know enough about how they are going to react to turn them loose on an unsuspecting public.

          • (Score: 0) by Anonymous Coward on Saturday June 23 2018, @10:26PM (2 children)

            by Anonymous Coward on Saturday June 23 2018, @10:26PM (#697372)

            There is already good equipment available for non-destructive testing off the highway. For example, here are some little electric robot "sleds" that are not damaged when driven over. They can roam all over a proving ground, coordinated externally, and can hold either a person manikin or even a "bicycle" manikin: https://www.youtube.com/watch?v=x7-SS1LxjPw [youtube.com] The same company also sells larger platforms that carry dummy cars, with correct light and radar reflectivity -- they come apart when hit and snap back together. Originally developed for ADAS (advanced driver assist system) testing, they work equally well for AV testing.

            • (Score: 2) by maxwell demon on Sunday June 24 2018, @09:33AM (1 child)

              by maxwell demon (1608) on Sunday June 24 2018, @09:33AM (#697489) Journal

              That's all nice, but the problem with those is that if you prearrange tests, then you test only what you thought of. At some point you have to go out and test it in the real world, as only that will tell you how well the car copes with unexpected situations.

              --
              The Tao of math: The numbers you can count are not the real numbers.
              • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @02:07PM

                by Anonymous Coward on Sunday June 24 2018, @02:07PM (#697558)

                Of course. But a dummy/person jaywalking obliviously (with or without pushing a bicycle) across a wide road would be one of the obvious test cases for proving ground debugging. Early recognition by the AV AND a speed and/or path change to miss the pedestrian trajectory would be one reasonable result.

                I have no idea what kinds of development testing Uber has done off the public roads, but it appears they didn't do this one.

          • (Score: 2) by Fluffeh on Monday June 25 2018, @04:30AM

            by Fluffeh (954) Subscriber Badge on Monday June 25 2018, @04:30AM (#697963) Journal

            You have to keep in mind though, that this is Uber we're talking about. Not exactly a company well known for trying to "do things right" but rather "do it cheap, then improve only if you HAVE to".

            They would be paying bottom dollar for the "tester" and giving the minimum training they think is needed to put the car on the road. Their entire existence thus far could be summed up by "economies of scale and try to overpower anything else in court". This is just another bump in their lawyers' billings - and in all likelihood, they will try to throw their "tester" under the bus (oh, the irony of that statement) to get out of paying extra damages or having any additional restrictions placed on their driverless testing programme.

      • (Score: 3, Troll) by Runaway1956 on Saturday June 23 2018, @04:49PM

        by Runaway1956 (2926) Subscriber Badge on Saturday June 23 2018, @04:49PM (#697228) Journal

        The whole point is - there is no such thing as a "self driving car". Not yet. Everything that I have read, to date, stipulates that the "autopilot" or whatever may fail at any time, and that the driver should be ready to take over. And, in this particular case, the driver was hired as a "safety" backup.

        You don't get to use alpha software in mission critical situations, yawn, and go to sleep, allowing that alpha software to run amok. The purpose of that safety driver is to help "evolve" that software into a beta state, while at the same time ensuring that the alpha didn't kill anyone.

      • (Score: 4, Insightful) by SomeGuy on Saturday June 23 2018, @04:50PM (13 children)

        by SomeGuy (5632) on Saturday June 23 2018, @04:50PM (#697230)

        "programmed incorrectly"?

        I laugh at the idea that such a complex system could ever be programmed absolutely "correctly". Most modern software is subject to weekly updates that fix a constant barrage of security issues and bugs, and may at any time introduce sloppy, random, buggy new mis-features at some manager's whim. Especially if your vendor decides to drop support for a car after two years or so but people keep driving them anyway.

        I personally don't believe "self driving" cars can even become a reality until people change the way they think of roads and "driving". By necessity, roads must be thought of more like railroad tracks. If you run out on to a railroad track and get run over, then whose fault is it likely to be?

        Until "self driving", one way or the other, really becomes self-driving enough that people actually CAN sit back and watch TV, it should not ever, ever be called "self driving".

        • (Score: 5, Interesting) by frojack on Saturday June 23 2018, @05:00PM (8 children)

          by frojack (1554) on Saturday June 23 2018, @05:00PM (#697237) Journal

          I laugh at the idea that such a complex system could ever be programmed absolutely "correctly".

          Nothing can be. However, make light of that as you may, you can't escape this:

          http://money.cnn.com/2018/05/24/technology/uber-arizona-self-driving-report/index.html [cnn.com]

          According to the National Transportation Safety Board, Uber's self-driving car accurately identified pedestrian Elaine Herzberg, 49, as she walked a bicycle across a Tempe, Arizona, road. But Uber had turned off the vehicle's automatic emergency braking, so the SUV did not attempt to brake.

          The SUV also lacked a way to alert the human driver behind the wheel to manually brake.

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 2) by RS3 on Saturday June 23 2018, @05:21PM (6 children)

            by RS3 (6367) on Saturday June 23 2018, @05:21PM (#697245)

            According to the National Transportation Safety Board, Uber's self-driving car accurately identified pedestrian Elaine Herzberg, 49, as she walked a bicycle across a Tempe, Arizona, road. But Uber had turned off the vehicle's automatic emergency braking, so the SUV did not attempt to brake.

            The SUV also lacked a way to alert the human driver behind the wheel to manually brake.

            I remember reading that, but I don't know the reasoning. I speculate that Uber did not want false-positives to cause the car to brake suddenly, for no good reason, and increase the chance of a rear-end collision.

            • (Score: 2, Insightful) by Anonymous Coward on Saturday June 23 2018, @06:11PM (2 children)

              by Anonymous Coward on Saturday June 23 2018, @06:11PM (#697289)

              Of all the collisions you can have, the rear end collision is the only one that wouldn't be the fault of Uber. Drivers are required to maintain sufficient space ahead of them that they can stop in case the car ahead of them slams on its brakes or otherwise comes to a stop.

              Being rear-ended is also the kind of collision for which a typical car has the best protection for the driver and the passengers.

              • (Score: 2) by frojack on Saturday June 23 2018, @07:23PM (1 child)

                by frojack (1554) on Saturday June 23 2018, @07:23PM (#697319) Journal

                Of all the collisions you can have, the rear end collision is the only one that wouldn't be the fault of Uber.

                Say what?

                If uber disabled the forward collision avoidance system, it most certainly would be their fault.

                --
                No, you are mistaken. I've always had this sig.
                • (Score: 1, Informative) by Anonymous Coward on Saturday June 23 2018, @10:37PM

                  by Anonymous Coward on Saturday June 23 2018, @10:37PM (#697374)

                  Woooosh?
                  I read g-parent as saying, "If Uber left their version of automatic-braking on, the Uber car might stop semi-randomly (false alarm) and GET rear ended by some other car following too closely". At least in most cases, that would be the fault of the following car (driven by some unsuspecting person who wasn't expecting the Uber AV to stop at that time).

                  Separate thought:
                  The big mistake (imo) was that Uber didn't leave the Volvo emergency-braking system active; that system has already been debugged and probably would have saved the woman pushing the bicycle.

            • (Score: 5, Informative) by frojack on Saturday June 23 2018, @07:21PM (2 children)

              by frojack (1554) on Saturday June 23 2018, @07:21PM (#697318) Journal

              False positives really are not a problem with these systems.

              This is well proven technology, available for a decade on high end cars, and now filtering down to almost every brand.
              Subaru, Honda, Chevy, Ford, Standard equipment in most cases.

              On my 2012 vintage car, I've seen maybe 4 false positives, all from metal construction plates covering the roadway, but only at the bottom of a downgrade. An alarm sounds, the dash flashes BRAKE, but before the automatic brakes kick in the system realizes its error, does not brake, and extinguishes the BRAKE alarm.

              In actual danger situations my car does brake authoritatively. It detects the need to brake at least two cars ahead, even if the car immediately ahead does not brake. It braked for deer on a night so rainy and dark I couldn't see squat.

              Again, mine is old-ish tech - 2012. More modern systems are even better at this.

              False positives, for all intents and purposes, just don't happen. Turning this off on a car that will be carrying paying passengers is just insanely irresponsible.

              --
              No, you are mistaken. I've always had this sig.
              • (Score: 1, Interesting) by Anonymous Coward on Saturday June 23 2018, @09:15PM (1 child)

                by Anonymous Coward on Saturday June 23 2018, @09:15PM (#697348)

                False positives really are not a problem with these systems.

                  How so? I was an engineer on safety critical systems, and false positives were treated as nearly as big a failure as false negatives.

                • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @12:07AM

                  by Anonymous Coward on Sunday June 24 2018, @12:07AM (#697393)

                  And that's why false positives are not a problem on factory systems. Someone like you had them debugged.
          • (Score: 0) by Anonymous Coward on Saturday June 23 2018, @05:36PM

            by Anonymous Coward on Saturday June 23 2018, @05:36PM (#697257)

            Yeah, hacking is a thing so we really shouldn't be sending out millions of potential death machines.

        • (Score: 4, Insightful) by Gaaark on Saturday June 23 2018, @09:45PM (2 children)

          by Gaaark (41) on Saturday June 23 2018, @09:45PM (#697359) Journal

          AND, i don't want a car i have to watch over that i am not in constant control of: it IS too easy to get distracted.
          "Ooooh....gorgeous woman!" WHAM!

          If it CANNOT control itself in EVERY situation, and i may be held responsible for any bad outcomes when it suddenly relinquishes control to me, then i don't want it.

          If i'm gonna kill someone (through the car being 'at fault') and be held responsible, I WANT FULL CONTROL. Otherwise, i want exemption from prosecution.

          This "if the car cannot figure out a problem and hands control to the driver" stuff is nonsense: the human will either be ready (and is probably almost driving it himself) or he will be distracted and will not be able to assume control fast enough.

          Noooooooooooooooooooooooooope, not for me.

          --
          --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
          • (Score: 3, Insightful) by maxwell demon on Sunday June 24 2018, @09:39AM (1 child)

            by maxwell demon (1608) on Sunday June 24 2018, @09:39AM (#697491) Journal

            Maybe that's the thing Uber did wrong: They should not have had the driver just sitting there in case something goes wrong; they should have tasked him with constantly commenting on the behaviour of the car and the current traffic situation, to be recorded alongside the car data. Even if the comments are not too useful by themselves (but who knows whether they might uncover something interesting, too?), it would ensure that the driver stays focused on the behaviour of the car, and thus aware of any mistakes it makes.

            --
            The Tao of math: The numbers you can count are not the real numbers.
            • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @02:16PM

              by Anonymous Coward on Sunday June 24 2018, @02:16PM (#697562)

              Excellent idea. Reminds me of high school Driver's Education (early 1970s). One of the exercises we did in the car was "commentary driving", where the student driver was instructed to comment out loud about everything (possible threats) they noticed: car approaching from rear left, about to pass | light ahead turned green | check right mirror, glance into right blind spot, nothing close on the right at this time, could move right if the passing car comes too close | ...

              I don't think the instructor called it stream of consciousness, but that was essentially what we did.

        • (Score: 2) by realDonaldTrump on Sunday June 24 2018, @01:39AM

          by realDonaldTrump (6614) on Sunday June 24 2018, @01:39AM (#697425) Homepage Journal

          We're in the Age of Computer, the Age of Cyber, but nobody really knows what's going on. But nobody really knows what's going on inside somebody's head, either. Brain is still a HUGE mystery. The big thing is, cyber will be much cheaper. Cyber never sleeps. And cyber never needs to stop and take a leak. Crooked Hillary, did you see how long she was in the bathroom when we were supposed to be debating? Unbelievable!

          And we need the workers, believe me. No longer can we count on foreign workers coming in, that's going away very quickly. As our economy grows TREMENDOUSLY. The stock market and everything else. All those guys that are driving now -- the taxi, the truck, the limo -- we're going to need them for other jobs. Better jobs. In my Space Force and in the factories that are coming back from China, from Mexico, from all around the world. MAGA!!!

    • (Score: 3, Insightful) by bzipitidoo on Saturday June 23 2018, @05:09PM (9 children)

      by bzipitidoo (4388) on Saturday June 23 2018, @05:09PM (#697241) Journal

      Chill on the Personal Responsibility cult for a sec, bro.

      The self-driving software blew it. Had it detected the pedestrian, then it wouldn't have mattered what the human was doing. Further, is not the human present to take over if the software gets confused and sounds an alarm?

      Next, the pedestrian was jaywalking. She was not in a crosswalk. And it was dark. The article doesn't say if she was wearing bright colors, but it does say she was homeless, so I think it likely she was in dingy, dull, dirty clothes. Her bike should have had reflectors on it at the least, but if the bike is not well maintained, reflectors would certainly be regarded as an unnecessary luxury. It's quite possible the accident would have happened even if the driver had been paying attention.

      Another party to blame is society, for several things. First, why was that pedestrian homeless? If she had mental problems, she should have been getting help for that.

      Should the pedestrian have looked both ways and seen the car coming? Being homeless in America might have made her depressed and reckless to a suicidal degree, and she might not have cared much. America is especially vicious about shaming the downtrodden. Apt to assume it's your fault that you're homeless, think of you as a loser, and treat you with disgust and contempt. After all, we have this Prosperity Gospel idea that says the poor must have sinned. They deserve being poor; God is punishing them with poverty. No Christian Charity there, nope!

      Third, America has had such a long love affair with the car that we have neglected pedestrians. Walking is sooo low class. The neglect is especially severe in southerly states such as Arizona, and that's because A/C first became widely available in the 1950s, at the peak of the automobile transportation era, when society was trying to drive-in everything: theaters, restaurants, and who knows what else. Before A/C, the south was much more lightly settled. People didn't want to put up with the summer heat. So all that growth in the south happened when the car was king. It's a total pain to get anywhere on foot in most US cities, especially southern ones. Lots of bridges were built without a sidewalk, though it adds very little cost to make a bridge wider by the width of a walkway. Public transportation is spotty. Thankfully, there's more awareness now, and things are improving for pedestrians.

      So, yeah, lots of blame to go around.

      • (Score: 5, Insightful) by frojack on Saturday June 23 2018, @05:30PM (7 children)

        by frojack (1554) on Saturday June 23 2018, @05:30PM (#697252) Journal

        The self-driving software blew it. Had it detected the pedestrian, then it wouldn't have mattered what the human was doing. Further, is not the human present to take over if the software gets confused and sounds an alarm?

        Wait, wait, wait...

        The self driving software DID detect the pedestrian, over 6 seconds ahead of time.

        In fact, the Volvo XC90's standard advanced driver-assistance system installed on the car (cameras and radar) would have 1) alerted the driver, and 2) braked the car to a full stop, had Uber techs not disabled it. So not just negligence, but actively disabling a safety system.
        A stock Volvo XC90 would have avoided this accident!

        The Uber self driving system was not designed to alert the driver of a need to brake, and it didn't trigger any brakes itself.

        The driver had duties that required taking her eyes off the road, such as monitoring the console of streaming messages from the system. She said she was performing those duties, not watching a video.

        https://www.bloomberg.com/news/articles/2018-03-26/uber-disabled-volvo-suv-s-standard-safety-system-before-fatality [bloomberg.com]
        https://www.cbc.ca/news/business/uber-arizona-crash-1.4594939 [www.cbc.ca]

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 1, Informative) by Anonymous Coward on Saturday June 23 2018, @06:25PM (6 children)

          by Anonymous Coward on Saturday June 23 2018, @06:25PM (#697298)

          Yes, and 6 seconds should have been sufficient for the driver to get eyes on the road and slow. Possibly to a complete stop, but definitely down to the point where the likelihood of a fatality was significantly reduced.

          • (Score: 2) by Runaway1956 on Sunday June 24 2018, @12:39AM (5 children)

            by Runaway1956 (2926) Subscriber Badge on Sunday June 24 2018, @12:39AM (#697407) Journal

            In all honesty, six seconds is not enough to stop. Perhaps, one day, I'll re-write in my journal a real-life story that took place in Texas Canyon. (My original story is on a site that is no longer maintained, and I can't get at it.)

            But, six seconds is more than adequate time to twitch the steering wheel, and to change lanes, while at the same time slowing. The difference of ten or fifteen mph can be the difference between life and death upon impact, if impact is unavoidable. Six seconds is most definitely enough time to avoid an accident, in many cases. Six seconds is definitely enough time to lessen the seriousness of an accident.

            • (Score: 2) by schad on Sunday June 24 2018, @02:34AM (3 children)

              by schad (2398) on Sunday June 24 2018, @02:34AM (#697437)

              From what I've found, a Volvo XC90 can decelerate to a stop from 62mph in 3.1 seconds. In that time it will have traveled 138 feet. Better cars can do it in closer to 110 feet (about 2.5 seconds at 1.1g), but that's probably respectable for a big heavy SUV.

              In short, six seconds is plenty of time. You could probably come to a stop from 80 or maybe even 100 in six seconds, even accounting for human reaction time.
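              For anyone who wants to check those figures, here's a back-of-the-envelope sketch of the constant-deceleration kinematics. The 0.9 g and 1.1 g decelerations come from the numbers above; the 1.5 s perception-reaction time is a common rule-of-thumb assumption, not anything measured in this crash:

              G = 9.81  # gravitational acceleration, m/s^2

              def stopping(speed_mph, decel_g, reaction_s=0.0):
                  """Total time (s) and distance (ft) to stop, assuming constant deceleration."""
                  v = speed_mph * 0.44704               # mph -> m/s
                  a = decel_g * G                       # deceleration, m/s^2
                  t = reaction_s + v / a                # reaction time plus braking time
                  d = v * reaction_s + v * v / (2 * a)  # reaction distance plus v^2/(2a)
                  return t, d * 3.28084                 # metres -> feet

              print(stopping(62, 0.9))        # ~3.1 s, ~143 ft: the XC90 figure
              print(stopping(62, 1.1))        # ~2.6 s, ~117 ft: the "better cars" case
              print(stopping(62, 0.9, 1.5))   # ~4.6 s, ~279 ft: adding an alert driver's reaction time

              Even with the reaction allowance, a stop from 62 mph fits comfortably inside six seconds.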

              • (Score: 2) by opinionated_science on Sunday June 24 2018, @04:17AM

                by opinionated_science (4031) on Sunday June 24 2018, @04:17AM (#697455)

                Especially since modern anti-skid braking is so effective. I have only used mine once for its intended purpose, and I was genuinely surprised how effective it was.

                Toyota V6 Camry with anti-skid, and I keep the tyres in good condition. On Interstate 40 I had to decelerate from 70 to a stop, and the 200 cars ahead had at least 5 collisions. Fortunately, nothing serious.

                Just remember: seat belts, collapsible steering columns, laminated windscreens, jointed transmission shafts, collapsible body frames - all had a huge effect on survivable collisions before 1980 (feel free to add to this list).

                Self-driving cars, if implemented properly (i.e. fully tested in increasingly less controlled conditions), could save an enormous amount of injury and death.

              • (Score: 2) by Runaway1956 on Sunday June 24 2018, @11:20AM (1 child)

                by Runaway1956 (2926) Subscriber Badge on Sunday June 24 2018, @11:20AM (#697507) Journal

                I don't see where you're including recognition of a hazard, and reaction time in your numbers. If the only thing being considered is the physics of the vehicle, that is, the time and distance AFTER full braking power is applied, then I can accept your numbers.

                In real life, that "safety driver" wasn't fully alert 100% of the time, as others have pointed out. Uber's alarm should have alerted the driver, but it takes a moment to refocus full attention on the road, another moment to recognize the hazard, another to reach for the brakes, and perhaps another moment still to really stand on the brakes for a panic stop. Moments ticking by, while the car continues at or near full speed.

                I could forgive that "safety driver" if he had merely allowed his mind to wander a little bit. That happens to everyone, professional or not. But he was WATCHING A MOVIE!! That is plain and simple dereliction of duty. And, that certainly had an effect on his/her reactions when the emergency occurred.

                If the driver had been paying attention, had he recognized the emergency for what it was, he probably wouldn't have tried to brake at all. Instead, the car could have been steered around the obstacle.

                To the best of my knowledge, there was little traffic, and no traffic close enough to be a hazard in the next lane over.

                I think that all of us have chosen to steer around a hazard, rather than brake for it, haven't we? I have many times!

                • (Score: 2) by schad on Monday June 25 2018, @12:28AM

                  by schad (2398) on Monday June 25 2018, @12:28AM (#697848)

                  I didn't include that stuff because it didn't matter: the driver was negligently inattentive, and the car had no ability to warn her. If the car had somehow noticed six minutes in advance, it would have made no difference.

                  My point was only that six seconds is actually plenty of time to react, assuming that everyone involved actually had the ability to do so. If the driver had been paying attention, six seconds is plenty of time to come to a stop. If the car could sound an alarm, even an initially-inattentive driver could have realized the problem and come to a stop (or at least slowed enough for the pedestrian to be far more likely to survive). And finally, if the car had the ability to apply the brakes on its own, six seconds is enough time to do so without even needing to "panic stop" (which is dangerous in its own right).

            • (Score: 2) by Thexalon on Sunday June 24 2018, @02:22PM

              by Thexalon (636) on Sunday June 24 2018, @02:22PM (#697563)

              In all honesty, six seconds is not enough to stop.

              According to the math [nacto.org], in most vehicles the stopping time is in the approximate range of 4 seconds. That can be a very significant distance (in the 0.1 miles range) if you started at highway speeds, and I'm assuming relatively dry road conditions, but it's not impossible to stop in 4 seconds if the human is reacting reasonably quickly. The human in this case was not reacting reasonably quickly because they were busy watching their show and not watching the road.
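              As a rough cross-check (the 70 mph starting speed, ~0.9 g of braking, and 1.5 s reaction time below are generic dry-pavement assumptions, not figures from the NACTO page or the crash report):

              v = 70 * 0.44704                     # 70 mph in m/s (~31.3 m/s)
              a = 0.9 * 9.81                       # ~0.9 g of braking on dry pavement
              t_total = 1.5 + v / a                # reaction time + braking time
              d_total = 1.5 * v + v**2 / (2 * a)   # reaction + braking distance, metres
              print(round(t_total, 1), "s,", round(d_total / 1609.34, 2), "miles")  # 5.0 s, 0.06 miles

              Wet roads, worn tires, or a slower reaction push that toward the 0.1 mile figure.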

              And that's why yellow traffic lights are typically in the range of 3-4 seconds, time enough to either stop or accelerate through.

              --
              The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 2, Insightful) by Anonymous Coward on Saturday June 23 2018, @06:11PM

        by Anonymous Coward on Saturday June 23 2018, @06:11PM (#697290)

        That area isn't as dark as Uber's video made it out to be ( https://arstechnica.com/cars/2018/03/police-chief-said-uber-victim-came-from-the-shadows-dont-believe-it/ [arstechnica.com] ).

        The car had plenty of time to decide to stop or slow down to make the collision non-fatal. I've done far better at emergency stops and I'm far from the best driver on the road.

        Uber should be banned from testing or making self driving cars for 5 years. With their sort of crappy corporate culture they probably slapped on something like Tesla's autopilot with some mods and then called it self-driving just for the publicity and PR.

        Let the more responsible contenders with actual self-driving car tech continue.

  • (Score: 5, Insightful) by Anonymous Coward on Saturday June 23 2018, @05:11PM (9 children)

    by Anonymous Coward on Saturday June 23 2018, @05:11PM (#697243)

    Bear in mind that you are asking something VERY DIFFICULT, NIGH IMPOSSIBLE of the test driver: stay alert at all times, even to the point of instant reaction, while doing absolutely nothing 99% of the time.

    I agree she should not have streamed a movie, but the human mind must have something to do other than be on high vigilance for hours at a time. Ironically, driving the car would have given her something to do and would have engaged her with her environment to where the collision would probably not have happened. You can't let the car drive itself and yet be simultaneously responsible for driving it yourself at a moment's notice. You are either driving the car or you are not. I know this woman was a tester, but this fundamental problem would still exist in production. Anything less than a car that totally takes care of itself in "self"-driving mode is unacceptable.

    • (Score: 2) by Fnord666 on Saturday June 23 2018, @07:44PM (4 children)

      by Fnord666 (652) on Saturday June 23 2018, @07:44PM (#697325) Homepage

      But they're Agile! They just have to deliver a Minimum Viable Product, right? I'm sure safety is on the development roadmap in the near future. Maybe someone could write up a quick user story?
      • (Score: 2, Informative) by Anonymous Coward on Saturday June 23 2018, @09:28PM (3 children)

        by Anonymous Coward on Saturday June 23 2018, @09:28PM (#697352)

        Airline pilots are trained for hundreds of hours to be able to take over from the automatic systems and they typically have minutes to do so when they have to.

        • (Score: 1) by tftp on Sunday June 24 2018, @12:20AM

          by tftp (806) on Sunday June 24 2018, @12:20AM (#697402) Homepage

          Here is a highly relevant video:

          https://m.youtube.com/watch?v=KK5KTQGuXSQ [youtube.com]

          Watch from the beginning, the event at 00:50.

        • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @12:28AM (1 child)

          by Anonymous Coward on Sunday June 24 2018, @12:28AM (#697404)

          Yes, they have minutes. Automobile drivers have SECONDS.
          MASSIVE difference.

          • (Score: 2) by maxwell demon on Sunday June 24 2018, @09:49AM

            by maxwell demon (1608) on Sunday June 24 2018, @09:49AM (#697494) Journal

            Have you watched the video linked by the sibling post? That definitely was not minutes of reaction time.

            --
            The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 1, Interesting) by Anonymous Coward on Saturday June 23 2018, @09:25PM

      by Anonymous Coward on Saturday June 23 2018, @09:25PM (#697350)

      This was something the aircraft industry figured out long ago. You are correct, as other car companies have been, in pointing out that the human brain is not a machine and can't be expected to work like one. I don't remember what the numbers are, but it is on the order of seconds to come up to situational awareness.

      http://raes-hfg.com/crm/reports/sa-defns.pdf [raes-hfg.com]

      See also human factors: Human Factors Analysis and Classification System

      Adverse Mental State: Refers to factors that include those mental conditions that affect performance (e.g., stress, mental fatigue, motivation).
      Adverse Physiological State: Refers to factors that include those medical or physiological conditions that affect performance (e.g., medical illness, physical fatigue, hypoxia).
      Physical/Mental Limitation: Refers to when an operator lacks the physical or mental capabilities to cope with a situation, and this affects performance (e.g., visual limitations, insufficient reaction time).

    • (Score: 4, Interesting) by Gaaark on Saturday June 23 2018, @09:53PM (2 children)

      by Gaaark (41) on Saturday June 23 2018, @09:53PM (#697362) Journal

      Fully and completely agree.

      Either let me drive or make me not criminally (or anywise) responsible. DO NOT make me a passive assist driver who has to take control when the car comes upon a situation it can't figure out.
      The LIDAR saw her six seconds out (LIDAR doesn't care how light or dark the roadway is): unless the driver was at the point of almost driving the vehicle themselves, the person would probably still have been at least injured.

      UBER should be banned from doing this and there should be criminal proceedings against them for 'tuning' the software.

      --
      --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
      • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @06:41PM (1 child)

        by Anonymous Coward on Sunday June 24 2018, @06:41PM (#697654)

        LIDAR isn't magic though. Even if someone is within view of the LIDAR, there is some serious Computer Vision involved in segmenting and "detecting" that person amid all the noise and things you don't want to detect.

        • (Score: 2) by Gaaark on Sunday June 24 2018, @11:10PM

          by Gaaark (41) on Sunday June 24 2018, @11:10PM (#697787) Journal

          But it's also Uber's fault for turning it down so it wouldn't brake for things it couldn't identify.

          As @Apparition(?) said elsewhere, they should have had the driver identifying everything the car braked for.

          Have a heads-up display show what it was braking for and have the driver identify it for later research, BUT brake first.

          --
          --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
  • (Score: 2, Insightful) by Anonymous Coward on Saturday June 23 2018, @05:33PM (2 children)

    by Anonymous Coward on Saturday June 23 2018, @05:33PM (#697254)

    Why would the person who is in effect a passenger, be liable for watching TV? The car was supposed to take care of itself.

    • (Score: 0) by Anonymous Coward on Saturday June 23 2018, @07:11PM

      by Anonymous Coward on Saturday June 23 2018, @07:11PM (#697315)

      The Tesla-semitrailer crash in Florida had a similar element.

      Joshua Brown, driving in a rural setting where there was little traffic, set the system for 74 MPH [google.com] and settled in to watch Harry Potter, [google.com] ignoring repeated warnings [google.com] from the system to put his goddamned hands on the steering wheel.

      Darwin Award.

      -- OriginalOwner_ [soylentnews.org]

    • (Score: 1, Insightful) by Anonymous Coward on Saturday June 23 2018, @11:55PM

      by Anonymous Coward on Saturday June 23 2018, @11:55PM (#697390)

      Why would the person who is in effect a passenger, be liable for watching TV? The car was supposed to take care of itself.

      Because the car was known to not yet be reliable enough to take care of itself. The person was not a passenger. It was literally her job to be ready to jump in and take control when -- not if -- the car encountered a situation where it did not respond correctly.

      Being in that situation would be mind-numbing, I have no doubt, especially if the autonomous software was approaching the "pretty good most of the time" stage. Something to keep her entertained, like audiobooks, might have been a good way to while away the hours. Something that takes both eyes and ears off the road, like watching TV, is about the stupidest thing a safety driver could do short of taking a nap.

  • (Score: 1, Informative) by Anonymous Coward on Saturday June 23 2018, @06:38PM (1 child)

    by Anonymous Coward on Saturday June 23 2018, @06:38PM (#697303)

    The article doesn't actually say that the safety driver was watching Hulu.

    It says that someone was watching Hulu using the driver's Hulu account, and that they happened to stop watching shortly after the collision. It is entirely plausible that the person watching was not the driver, but perhaps was the driver's spouse or a friend?

    Unfortunately the article does not actually link to the police report.

    • (Score: 0) by Anonymous Coward on Saturday June 23 2018, @10:54PM

      by Anonymous Coward on Saturday June 23 2018, @10:54PM (#697376)

      And they were watching 'The Voice'. If it was her watching, I wonder who was singing at the time.

      From the shocked look in the Uber video when the car hit the pedestrian, I imagine she'd just had the double shock of some stupid 'coach' comment or some utterly shite robotic try-hard singing immediately before the collision.

      Her nightmares would be not just of the collision, but of whatever crap was on the Hulu. She'll forever fear using the Cyber!

  • (Score: 1) by AlphaSnail on Sunday June 24 2018, @12:27AM (2 children)

    by AlphaSnail (5814) on Sunday June 24 2018, @12:27AM (#697403)

    If I run into the street without looking and get hit by a car, I killed myself. If I move into traffic on foot on purpose, expecting it to swerve around me, but it fails to do so and I am struck and killed, I killed myself. But if a robot car is driving, all of a sudden it's the car's fault? All of this "the tech isn't safe yet" discussion ignores that this person made a suicidal move regardless of how the vehicle was operating.

    I will be more concerned when I hear stories of self driving cars jumping up on curbs and killing people on the sidewalk, or driving off of bridges killing the occupants. If you're going to run into the street in front of moving vehicles and you die, well, tough. If it was an 18 wheeler, would you expect it to stop on a dime when someone darted in front of it because they didn't understand the concept of look both ways? Is the automated vehicle supposed to swerve, and possibly cause someone else a potentially fatal accident, just to get out of the way of someone who cares so little about life that they risk their own and others' just to get across the street without looking first?

    I say unless the robot car swerved into you from its otherwise straight trajectory, you are at fault for getting in the way. Can we make that a law? If you get in front of a moving vehicle of your own free will, then you are at fault regardless of whatever is driving it, be it man or machine. The punishment can be your injuries, how's that? I think people are picturing these cars mowing down children chasing balls in the street, but places where cars drive fast aren't places children should be playing in the first place, and where they are, the robot cars should be driving slow enough to easily stop. So I don't think the 'danger' is as real as people are projecting.

    • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @12:40AM

      by Anonymous Coward on Sunday June 24 2018, @12:40AM (#697408)

      Drivers, human or otherwise, are expected to avoid hitting pedestrians.
      This vehicle, as delivered, had the ability to avoid hitting the pedestrian but WAS DISABLED FROM DOING SO by Uber.
      If I place blame anywhere, it is on Uber.

    • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @01:10AM

      by Anonymous Coward on Sunday June 24 2018, @01:10AM (#697413)

      > If I run into the street without looking and get hit by a car - I killed myself.

      While I tend to agree with you, that is not the way the law is written in many areas. Hitting a pedestrian (no matter what they were doing) is first the fault of the motor vehicle operator.

  • (Score: 0) by Anonymous Coward on Sunday June 24 2018, @03:53PM

    by Anonymous Coward on Sunday June 24 2018, @03:53PM (#697593)

    The law should be that a man waving a flag must walk in front of all self driving cars.

    Or, at least, bright flashing strobes when a car is in autopilot.
    Perhaps some sort of transponder broadcasting its position, like planes and boats do, so people can tell if they are on a collision course.

    Alert pedestrians and cyclists that 'HEY, NO BODY IS DRIVING THIS VEHICLE' - FLASH FLASH FLASH FLASH BEEP BEEP BEEP BEEP

    End of the day, it sounds like this car decided to plow over the lady, instead of braking.

    Yes distracted driving is a problem, but why was that lady in a death machine to begin with?
