
posted by Fnord666 on Wednesday September 13 2017, @07:51PM
from the tragic-events dept.

http://abcnews.go.com/Technology/teslas-semi-autonomous-system-contributed-deadly-crash-feds/story?id=49795839

Federal investigators announced Tuesday that the design of Tesla's semiautonomous driving system allowed the driver of a Tesla Model S in a fatal 2016 crash with a semi-truck to rely too heavily on the car's automation.

"Tesla allowed the driver to use the system outside of the environment for which it was designed," said National Transportation Safety Board Chairman Robert Sumwalt. "The system gave far too much leeway to the driver to divert his attention."

The board's report identifies the probable cause of the collision as the truck driver's failure to yield, combined with the Tesla driver's overreliance on his car's automation — or Autopilot, as Tesla calls the system. Tesla's system design was cited as a contributing factor.

[...] A Tesla spokesperson provided a statement to ABC News that read, "We appreciate the NTSB's analysis of last year's tragic accident, and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."

According to The Associated Press, members of the family of Joshua Brown, the Tesla driver killed in the crash, said on Monday that they do not blame the car or the Autopilot system for his death.

A National Highway Traffic Safety Administration report on the crash can be found here. The NTSB has not yet published its full report; a synopsis of it can be found here.

Also at The Verge and CNN


Original Submission

 
  • (Score: 2, Informative) by Revek on Wednesday September 13 2017, @08:15PM (12 children)

    by Revek (5022) on Wednesday September 13 2017, @08:15PM (#567440)

    No, anyone who blames the system for the driver who didn't use it correctly is a full-time idiot.

    --
    This page was generated by a Swarm of Roaming Elephants
  • (Score: 5, Insightful) by n1 on Wednesday September 13 2017, @08:36PM (9 children)

    by n1 (993) on Wednesday September 13 2017, @08:36PM (#567447) Journal

    I agree with your sentiment, but Tesla and Musk have been very keen to promote Autopilot capabilities beyond what is really appropriate, in my opinion.

    When this crash happened, Tesla's website was promoting Autopilot as 'automatic steering, speed, lane changing and parking'.

    Around the same time elsewhere...

    Tesla removed a Chinese term for “self-driving” from its China website [reuters.com] after a driver in Beijing who crashed in “autopilot” mode complained that the car maker overplayed the function’s capability and misled buyers.

    More recently Elon Musk boasted [thestreet.com] on Twitter about watching the eclipse through the glass roof of a Model S whilst using Autopilot.

    Musk told reporters that the Model S was “probably better than humans at this point in highway driving”. Before the updated autopilot was released, he said that the car was “almost able to go [between San Francisco and Seattle] without touching the controls at all”.

    Talulah Riley, Musk’s [ex]wife, shared and deleted an Instagram video of herself driving on the highway between Los Angeles and San Diego without holding the wheel. [theguardian.com]

    This PR-versus-Terms-&-Conditions problem gets worse with the ever-present promotion that next week your car could get an update making Autopilot even better/worse than it was, so the intended use of the system changes while you sleep.

    • (Score: 3, Touché) by DannyB on Wednesday September 13 2017, @09:02PM (8 children)

      by DannyB (5839) Subscriber Badge on Wednesday September 13 2017, @09:02PM (#567465) Journal

      Yep. Autopilot may indeed be better than some humans at highway driving. But that doesn't make it a replacement for paying attention to the road and keeping your hands on the steering wheel instead of sending tweets.

      --
      The lower I set my standards the more accomplishments I have.
      • (Score: 4, Insightful) by frojack on Wednesday September 13 2017, @09:22PM (7 children)

        by frojack (1554) on Wednesday September 13 2017, @09:22PM (#567475) Journal

        The Straight Crossing Path scenario (see page 3 of the NHTSA report [nhtsa.gov]), such as a car running a red light, is almost impossible for any autonomous driving software to detect, especially if it includes multiple lanes of stopped traffic, trees, or similar sight-line issues. The crossing car is in the sensors' sight lines for far too short a time period.

        No autopilot system on the road has been certified for this scenario, even though BMW, Tesla, and Volvo all offer autopilot systems.

        Which makes it hard to believe that fully autonomous driverless cars are going to fare well in this situation either.

        People need only watch a few Russian dash cam videos on YouTube to see just how frequently this happens.
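
        For a feel of the numbers, here's a rough back-of-the-envelope sketch (the sensor range, field of view, and speed below are hypothetical values, not figures from the NHTSA report):

        package main

        import (
                "fmt"
                "math"
        )

        // crossingTimeInView estimates how long a crossing vehicle stays inside
        // a forward-facing sensor's field of view. Simplified geometry: the
        // crossing car traverses the chord of the sensor cone at maximum range.
        func crossingTimeInView(sensorRangeM, fovHalfAngleDeg, crossingSpeedMps float64) float64 {
                chordM := 2 * sensorRangeM * math.Tan(fovHalfAngleDeg*math.Pi/180)
                return chordM / crossingSpeedMps
        }

        func main() {
                // Hypothetical setup: 60 m effective range, 10-degree half-angle
                // cone, crossing car doing 50 km/h (about 13.9 m/s).
                t := crossingTimeInView(60, 10, 50/3.6)
                fmt.Printf("Crossing car in view for roughly %.1f s\n", t) // ~1.5 s
        }

        Occlusion by stopped traffic or trees only shrinks that window further.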

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 3, Insightful) by Nerdfest on Wednesday September 13 2017, @09:36PM

          by Nerdfest (80) on Wednesday September 13 2017, @09:36PM (#567484)

          People suck just as badly at that scenario though, maybe worse.

        • (Score: 3, Insightful) by BasilBrush on Wednesday September 13 2017, @10:09PM (5 children)

          by BasilBrush (3994) on Wednesday September 13 2017, @10:09PM (#567502)

          The more autonomous systems on the road, the fewer red lights will be jumped.

          --
          Hurrah! Quoting works now!
          • (Score: 3, Interesting) by frojack on Thursday September 14 2017, @12:45AM (4 children)

            by frojack (1554) on Thursday September 14 2017, @12:45AM (#567545) Journal

            Maybe. Maybe not. Why wouldn't the AI in the software know about the 2-second all-way red, or the extended yellow, and try to take advantage of it just like human drivers do? Remember, to err is human. To really fuck things up you need a computer.

            I can see improvement in this area when all cars communicate with each other indicating speed, location, and direction.
            Tinfoil sales are likely to hit boom times. But in-dash warnings of unsafe crossing conditions (even when the light is green) will provide safety benefits to human drivers as well as autonomous cars.
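
            As a minimal sketch of that idea (the beacon fields and the 2-second conflict window are illustrative assumptions, not any real V2V message standard):

            package main

            import (
                    "fmt"
                    "math"
            )

            // Beacon is a periodic broadcast from a vehicle: position, speed,
            // heading. (Illustrative fields only.)
            type Beacon struct {
                    XM, YM     float64 // east / north position, metres
                    SpeedMps   float64
                    HeadingRad float64 // 0 = east, pi/2 = north
            }

            // timeToPoint returns seconds until the vehicle reaches (px, py) at
            // its current speed and heading; +Inf if moving away or stopped.
            func timeToPoint(b Beacon, px, py float64) float64 {
                    along := (px-b.XM)*math.Cos(b.HeadingRad) + (py-b.YM)*math.Sin(b.HeadingRad)
                    if along <= 0 || b.SpeedMps <= 0 {
                            return math.Inf(1)
                    }
                    return along / b.SpeedMps
            }

            // crossingConflict warns when both vehicles reach the intersection
            // (ix, iy) within windowS seconds of each other.
            func crossingConflict(a, b Beacon, ix, iy, windowS float64) bool {
                    ta, tb := timeToPoint(a, ix, iy), timeToPoint(b, ix, iy)
                    return !math.IsInf(ta, 1) && !math.IsInf(tb, 1) && math.Abs(ta-tb) < windowS
            }

            func main() {
                    // Car heading east at 15 m/s, truck heading north at 12 m/s,
                    // both approaching an intersection at the origin.
                    car := Beacon{XM: -60, YM: 0, SpeedMps: 15, HeadingRad: 0}
                    truck := Beacon{XM: 0, YM: -50, SpeedMps: 12, HeadingRad: math.Pi / 2}
                    fmt.Println(crossingConflict(car, truck, 0, 0, 2.0)) // true: ~4.0 s vs ~4.2 s
            }

            The same check could drive an in-dash warning for a human driver as well as a braking decision for an autonomous car.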

            --
            No, you are mistaken. I've always had this sig.
            • (Score: 2) by BasilBrush on Thursday September 14 2017, @01:44AM (1 child)

              by BasilBrush (3994) on Thursday September 14 2017, @01:44AM (#567570)

              Because the number one objective is safety.

              --
              Hurrah! Quoting works now!
              • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @08:16AM

                by Anonymous Coward on Thursday September 14 2017, @08:16AM (#567694)

                If the number one objective is safety, they wouldn't have built a broken system that requires humans to do the one thing that humans are especially bad at and computers are really good at: just sitting there, monitoring traffic, ready to take over at short notice.

            • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @04:02AM

              by Anonymous Coward on Thursday September 14 2017, @04:02AM (#567628)

              For starters, there is no AI.

            • (Score: 2) by DeathMonkey on Thursday September 14 2017, @07:46PM

              by DeathMonkey (1380) on Thursday September 14 2017, @07:46PM (#568039) Journal

              Why wouldn't the AI in the software know about the 2-second all-way red, or the extended yellow, and try to take advantage of it just like human drivers do?

              Because auto manufacturers aren't stupid. Well...THAT stupid anyway!

  • (Score: 2) by tfried on Thursday September 14 2017, @08:16AM

    by tfried (5534) on Thursday September 14 2017, @08:16AM (#567692)

    Depends on the type of blame you have in mind.

    From a perspective of guilt, yes, clearly the driver is at fault for misusing the system.

    From a perspective of road safety, you will realize that users will be users, and a system that can be misused will be misused. From that perspective, it also makes sense to consider whether it is really a good idea to have a system that makes driving even more boring while still relying on the driver to pay full attention.

    I reckon the short term answer (before there is a fully-autonomous system) will be to make the system even more obnoxious in watching the driver for signs of inattention. But that may also make the system much less popular...

  • (Score: 3, Informative) by theluggage on Thursday September 14 2017, @10:37AM

    by theluggage (1797) on Thursday September 14 2017, @10:37AM (#567724)

    No, anyone who blames the system for the driver who didn't use it correctly is a full-time idiot.

    Blame and responsibility don't obey a conservation law: the car driver can be responsible, and the truck driver can be responsible, and Tesla can be responsible. Lawyers and insurance companies like there to be one victim and one villain - for their own profit and convenience - but when it comes to learning lessons for the future you really need to look at the big picture.

    I don't think anyone here is saying that the driver wasn't stupid to ignore the warnings - but would it have made any difference if his hands were on the wheel unless his mind was on the job?

    There's an inconvenient truth about self-driving: it won't be ready until it is ready. Any system in which the car drives itself but expects Joe Public to stay alert and intervene when the computer screws up is an accident waiting to happen. Tesla's problem is that they want to crowdsource the alpha testing of their system rather than relying on expensive test drivers.