

posted by Fnord666 on Sunday January 22 2017, @02:14PM
from the not-the-NTSB dept.

Last Thursday the National Highway Traffic Safety Administration delivered the results of its investigation into the fatal 2016 crash of Joshua Brown, who was driving a Tesla with Autopilot software engaged.

"A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted," NHTSA's investigators found. In other words: Tesla didn't cause Brown's death.

The verdict should relieve not just the electric car builder, but the industry at large. Semi-autonomous and driver assistance technologies are more than a fresh source of revenue for automakers. They're the most promising way to cut into the more than 30,000 traffic deaths on US roads every year. Today's systems aren't perfect. They demand human oversight and can't handle everything the real world throws at them. But they're already saving lives.

NHTSA's goal wasn't to find the exact cause of the crash (that's up to the National Transportation Safety Board, which is running its own inquiry), but to root out any problems or defects with Tesla's Autopilot hardware and software.

The content of the investigation report is available from the NHTSA web site.

[Editor's note: Recently the link to the report has been returning an error occasionally. As an alternative, the Google webcache of the page is available, as is a copy of the report at archive.org.]


Original Submission

 
  • (Score: 1, Insightful) by Anonymous Coward on Sunday January 22 2017, @03:53PM

    by Anonymous Coward on Sunday January 22 2017, @03:53PM (#457351)

    I think this determination misses the point.
    There were no equipment failures.
    And they found the guy had at least 7 seconds to react.

    But that's too reductive. It ignores the impact of design on human behavior. At the time the Tesla software allowed the person in the driver's seat to take their hands off the wheel for very long periods of time with nothing more than an easily ignorable audio alert every once in a while. That's simply not enough to overcome natural human laziness. The car needs to verify that the driver has two hands on the wheel and if they don't an alert is not enough, it needs to slow down and try to park on the shoulder.

    So you can blame the guy for doing something stupid. But you also have to blame Tesla for setting up the system in a way that encourages stupidity. They are the experts with billions of dollars of expertise. They have a responsibility to know better.

  • (Score: 2) by Runaway1956 on Sunday January 22 2017, @04:04PM

    by Runaway1956 (2926) Subscriber Badge on Sunday January 22 2017, @04:04PM (#457353) Journal

    What I hear you saying is, the machinery should be forgiving of operator negligence. But, I think the machinery did forgive the driver, as he lay dying in the tangled wreckage. Or, maybe not. Maybe the machinery was there to greet him when he arrived in negligent driver hell.

    • (Score: 0) by Anonymous Coward on Sunday January 22 2017, @04:28PM

      by Anonymous Coward on Sunday January 22 2017, @04:28PM (#457359)

      > What I hear you saying is, the machinery should be forgiving of operator negligence.

      Then you are deaf.

      I am saying that system designs should encourage good behaviour and discourage bad behaviour.

      This is no different from draconian password policies that have the unintended effect of causing people to write their passwords down on post-it notes stuck to their monitors.

      • (Score: 2) by Runaway1956 on Sunday January 22 2017, @04:35PM

        by Runaway1956 (2926) Subscriber Badge on Sunday January 22 2017, @04:35PM (#457362) Journal

        Alright. Maybe I heard you wrong. But, I did read in the report that Tesla has taken several steps to impress on the driver how important it is that he be ready to take over. It's in the owner's manual, it's in the EULA, several indicators on screen try to alert the driver, as well as warning chimes. When Tesla found that all of this was not enough, they resorted to sensors in the steering wheel. If you don't make any input into the system every so often, the car slows down - and down - and down.

        Like any other company, or product, we can watch Tesla make better and better idiot proofing - but the world continues to produce better idiots. The local hospital ran an advertisement a couple days ago, "Get your new, improved idiots here!" A competing hospital in the next county has another ad, "Our idiots are proven superior to all competitor's idiots - and they're half off through the end of the month!"

        • (Score: 4, Insightful) by Arik on Sunday January 22 2017, @05:21PM

          by Arik (4543) on Sunday January 22 2017, @05:21PM (#457376) Journal
          No matter how obvious they make it, no matter how many times they say it, no matter how many forms it comes in, they're still asking something fundamentally impossible.

          A human cannot stay alert and ready to take over at a moment's notice like that. It's not psychologically possible. If he's in control, he'll focus on the road; if he's not, he'll find something else to focus on instead, whether internal or external, and he will not be in a position to take back over when the computer suddenly dumps the controls on him.

          --
          If laughter is the best medicine, who are the best doctors?
          • (Score: 3, Informative) by gringer on Sunday January 22 2017, @05:51PM

            by gringer (962) on Sunday January 22 2017, @05:51PM (#457388)

            This is not "a moment's notice", this is seven seconds, or about 200m of travel at the fastest legal road speed.
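            That back-of-the-envelope figure checks out; a quick sketch, assuming a 100 km/h open-road limit (the exact figure varies by country):

            ```python
            # Distance covered during a 7-second reaction window,
            # assuming a 100 km/h open-road speed limit.
            speed_kmh = 100
            speed_ms = speed_kmh / 3.6          # km/h -> m/s (~27.8 m/s)
            reaction_window_s = 7
            distance_m = speed_ms * reaction_window_s
            print(f"{distance_m:.0f} m")        # ~194 m, i.e. roughly 200 m
            ```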

            --
            Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]
            • (Score: 2) by HiThere on Sunday January 22 2017, @09:25PM

              by HiThere (866) Subscriber Badge on Sunday January 22 2017, @09:25PM (#457439) Journal

              It may be longer than a moment's notice, but it's too short a period of time to switch your attention to the road/environs and be ready to take control; not if you were thinking of something else, even if your hands were on the wheel the entire time and you were nominally looking at the road. Imagine, e.g., you were in the process of analyzing how large a hash table the program you were working on should use, and under what conditions it should be flushed to disk...now you've got 7 seconds...

              Taking control in 7 seconds is unreasonable. 20 seconds would be more reasonable, though even that would fail in tricky situations; if a decision/reaction within 7 seconds is needed, it's better that the auto-pilot act on its best guess.

              --
              Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
          • (Score: 3, Touché) by Thexalon on Monday January 23 2017, @10:43PM

            by Thexalon (636) on Monday January 23 2017, @10:43PM (#457837)

            If he's in control he'll focus on the road

            He will? I see all kinds of drivers of ordinary non-automated cars every day that definitely are not focused on the road.

            --
            The only thing that stops a bad guy with a compiler is a good guy with a compiler.
        • (Score: 1, Interesting) by Anonymous Coward on Sunday January 22 2017, @05:56PM

          by Anonymous Coward on Sunday January 22 2017, @05:56PM (#457390)

          Here's a user-interface design you might be better able to relate to: the safety on a gun.

          Your argument is the equivalent of saying that as long as the manual has a warning, and there is a sticker on the gun with a warning, then the gun manufacturer has done everything necessary and if the gun accidentally fires that's the owner's fault for being a dumbass.

          • (Score: 0) by Anonymous Coward on Monday January 23 2017, @07:37AM

            by Anonymous Coward on Monday January 23 2017, @07:37AM (#457565)

            Well, if it's a Glock, of course!

        • (Score: 3, Interesting) by vux984 on Sunday January 22 2017, @09:04PM

          by vux984 (5045) on Sunday January 22 2017, @09:04PM (#457429)

          But, I did read in the report that Tesla has taken several steps to impress on the driver how important it is that he be ready to take over.

          No one really disputes that.

          But you are still going against human nature by having a system that 'requires you to be ready to take over but doesn't demand that you actually do anything' for hours on end. That's not how people work, and no number of disclaimers is going to change that; no number of 'countermeasures' to try and keep tabs on people is going to achieve that.

          It's not a case of the world producing better idiots, because even perfectly normal people aren't wired to sit around 'ready but not doing anything'. Several studies have shown this.

          Either give people an autopilot that can drive itself without human intervention, that you the manufacturer are willing to take responsibility for. Or keep the automatic functions down to automatic braking and collision avoidance features to improve safety that cannot easily be gamed into 'driving for you'.

          The requirements to use Tesla's autopilot go against human nature.

          And comparing it to airline pilots is invalid -- pilots have a much more natural and realistic relationship with the 'autopilot'. They need to be a lot less alert and engaged than a Tesla driver -- a pilot can do paperwork, review aircraft manuals, and all sorts of tasks that demand much lower levels of engagement. And at 30,000 feet over the ocean they aren't 7 seconds from colliding with a truck.

      • (Score: 2) by HiThere on Sunday January 22 2017, @09:18PM

        by HiThere (866) Subscriber Badge on Sunday January 22 2017, @09:18PM (#457435) Journal

        Sometimes writing down the passwords and sticking them to the monitor is the right approach. Depends on what you are guarding against. If you aren't guarding against local intrusion, but only against net based intrusion, then there's nothing wrong with that approach. Not all situations are the same. (OTOH, I'm not aware of any situation in which some "draconian password policies" are good, it's just that sometimes writing down the password is a reasonable thing to do.)

        That said, for a mass-marketed vehicle, you need to presume that some fraction of the customers will act in ways that appear, to one who understands the system, really stupid. And you need to design with that in mind.

        OTOH, this was a reasonable design bug in a complex system. It needs to be fixed, but the company shouldn't be considered culpable...this time. If they don't fix it, however...

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 4, Insightful) by Arik on Sunday January 22 2017, @04:51PM

    by Arik (4543) on Sunday January 22 2017, @04:51PM (#457366) Journal
    It's completely unworkable to have a system that makes the call 99.9% of the time but expect the human 'driver' to nonetheless pay attention and remain alert and ready to seize control back whenever there's a problem. That's simply not a model that's compatible with human psychology. Either the computer has to be able to take over to the point the 'driver' can simply fall asleep, or there needs to be a full-time driver with the computer only providing information rather than actually driving. If the courts understand the issue they will eventually put an end to this practice via liability.
    --
    If laughter is the best medicine, who are the best doctors?