posted by Fnord666 on Wednesday September 13 2017, @07:51PM
from the tragic-events dept.

http://abcnews.go.com/Technology/teslas-semi-autonomous-system-contributed-deadly-crash-feds/story?id=49795839

Federal investigators announced Tuesday that the design of Tesla's semiautonomous driving system allowed the driver of a Tesla Model S in a fatal 2016 crash with a semi-truck to rely too heavily on the car's automation.

"Tesla allowed the driver to use the system outside of the environment for which it was designed," said National Transportation Safety Board Chairman Robert Sumwalt. "The system gave far too much leeway to the driver to divert his attention."

The board's report cites the truck driver's failure to yield, together with the Tesla driver's overreliance on his car's automation (Autopilot, as Tesla calls the system), as the probable cause of the collision. Tesla's system design was cited as a contributing factor.

[...] A Tesla spokesperson provided a statement to ABC News that read, "We appreciate the NTSB's analysis of last year's tragic accident, and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."

According to The Associated Press, members of the family of Joshua Brown, the Tesla driver killed in the crash, said on Monday that they do not blame the car or the Autopilot system for his death.

A National Highway Traffic Safety Administration report on the crash can be found here. The NTSB has not yet published its full report; a synopsis of it can be found here.

Also at The Verge and CNN


Original Submission

 
  • (Score: 5, Insightful) by n1 on Wednesday September 13 2017, @08:36PM (9 children)

    I agree with your sentiment, but Tesla and Musk have been very keen to promote Autopilot capabilities beyond what is really appropriate, in my opinion.

    When this crash happened, Tesla's website was promoting Autopilot as 'automatic steering, speed, lane changing and parking'.

    Around the same time elsewhere...

    Tesla removed a Chinese term for “self-driving” from its China website [reuters.com] after a driver in Beijing who crashed in “autopilot” mode complained that the car maker overplayed the function’s capability and misled buyers.

    More recently, Elon Musk boasted [thestreet.com] on Twitter about watching the eclipse through the glass roof of a Model S whilst using Autopilot.

    Musk told reporters that the Model S was “probably better than humans at this point in highway driving”. Before the updated autopilot was released, he said that the car was “almost able to go [between San Francisco and Seattle] without touching the controls at all”.

    Talulah Riley, Musk’s [ex]wife, shared and deleted an Instagram video of herself driving on the highway between Los Angeles and San Diego without holding the wheel. [theguardian.com]

    This PR vs. Terms & Conditions mismatch gets worse with the ever-present promotion that next week your car could get an update making Autopilot even better (or worse) than it was, so the intended use of the system changes while you sleep.

  • (Score: 3, Touché) by DannyB on Wednesday September 13 2017, @09:02PM (8 children)

    Yep. Autopilot may indeed be better than some humans at highway driving. But that doesn't make it a replacement for paying attention to the road and keeping your hands on the steering wheel instead of sending tweets.

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 4, Insightful) by frojack on Wednesday September 13 2017, @09:22PM (7 children)

      The Straight Crossing Path scenario (see page 3 of the NHTSA report [nhtsa.gov]), such as a car running a red light, is almost impossible for any autonomous driving software to detect, especially if it includes multiple lanes of stopped traffic, trees, or similar sight-line issues. The crossing car is in the sight lines of the sensors for far too short a time period.
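
      A rough back-of-the-envelope sketch of that geometry, in Python (every number below is an illustrative assumption, not a figure from the report):

          # How long is a crossing car inside a forward sensor's
          # field of view? Illustrative assumptions only.
          import math

          sensor_fov_deg = 50.0        # assumed horizontal field of view
          crossing_distance_m = 60.0   # assumed range to the crossing point
          crossing_speed_mps = 20.0    # crossing car doing ~45 mph

          # Width of the field of view at the crossing distance.
          window_m = 2 * crossing_distance_m * math.tan(
              math.radians(sensor_fov_deg / 2))

          # Time the crossing car spends inside that window.
          print(f"visible for ~{window_m / crossing_speed_mps:.1f} s")  # ~2.8 s

      A couple of seconds at best, before stopped traffic or trees occlude part of that window.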

      No autopilot system on the road has been certified for this, even though BMW, Tesla, and Volvo all offer autopilot systems.

      Which makes it hard to believe that fully autonomous driverless cars are going to fare well in this situation either.

      People need only watch a few Russian dash cam videos on YouTube to see just how frequently this happens.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 3, Insightful) by Nerdfest on Wednesday September 13 2017, @09:36PM

        People suck just as badly at that scenario though, maybe worse.

      • (Score: 3, Insightful) by BasilBrush on Wednesday September 13 2017, @10:09PM (5 children)

        The more autonomous systems on the road, the fewer red lights will be jumped.

        --
        Hurrah! Quoting works now!
        • (Score: 3, Interesting) by frojack on Thursday September 14 2017, @12:45AM (4 children)

          Maybe. Maybe not. Why wouldn't the AI in the software know about the 2-second all-way red, or the extended yellow, and try to take advantage of it just like drivers do? Remember, to err is human. To really fuck things up you need a computer.

          I can see improvement in this area when all cars communicate with each other, indicating speed, location, and direction.
          Tinfoil sales are likely to hit boom times. But in-dash warnings of unsafe crossing conditions (even when the light is green) will provide safety benefits to human drivers as well as autonomous cars.
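
          A hypothetical sketch of such a crossing warning (the message fields and the two-second threshold are my own assumptions, not any real V2V standard):

              # Hypothetical V2V crossing check: each car broadcasts
              # speed and distance to the crossing; warn if both
              # arrive within a small time window of each other.
              from dataclasses import dataclass

              @dataclass
              class V2VMessage:
                  dist_to_crossing_m: float  # distance to the crossing point
                  speed_mps: float

              def crossing_conflict(me, other, window_s=2.0):
                  # ETA of each vehicle at the crossing point.
                  my_eta = me.dist_to_crossing_m / max(me.speed_mps, 0.1)
                  other_eta = other.dist_to_crossing_m / max(other.speed_mps, 0.1)
                  return abs(my_eta - other_eta) < window_s

              # I'm 40 m out at 20 m/s; cross traffic is 45 m out at
              # 22 m/s -- ETAs of 2.00 s and 2.05 s, so warn.
              me = V2VMessage(dist_to_crossing_m=40.0, speed_mps=20.0)
              other = V2VMessage(dist_to_crossing_m=45.0, speed_mps=22.0)
              print(crossing_conflict(me, other))  # True -> in-dash warning

          The same check serves a human driver getting an in-dash warning or an autonomous car braking on its own.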

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 2) by BasilBrush on Thursday September 14 2017, @01:44AM (1 child)

            Because the number one objective is safety.

            --
            Hurrah! Quoting works now!
            • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @08:16AM

              If the number one objective is safety, they wouldn't have built a broken system that requires humans to do the one thing that humans are especially bad at, and computers are really good at: just sitting there, monitoring traffic, ready to take over at short notice.

          • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @04:02AM

            For starters, there is no AI.

          • (Score: 2) by DeathMonkey on Thursday September 14 2017, @07:46PM

            Why wouldn't the AI in the software know about the 2-second all-way red, or the extended yellow, and try to take advantage of it just like drivers do?

            Because auto manufacturers aren't stupid. Well...THAT stupid anyway!