Federal investigators announced Tuesday that the design of Tesla's semiautonomous driving system allowed the driver of a Tesla Model S in a fatal 2016 crash with a semi-truck to rely too heavily on the car's automation.
"Tesla allowed the driver to use the system outside of the environment for which it was designed," said National Transportation Safety Board Chairman Robert Sumwalt. "The system gave far too much leeway to the driver to divert his attention."
The board's report cites as the probable cause of the collision the truck driver's failure to yield, together with the Tesla driver's overreliance on his car's automation — or Autopilot, as Tesla calls the system. Tesla's system design was declared a contributing factor.
[...] A Tesla spokesperson provided a statement to ABC News that read, "We appreciate the NTSB's analysis of last year's tragic accident, and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."
According to The Associated Press, members of Brown's family said on Monday that they do not blame the car or the Autopilot system for his death.
A National Highway Traffic Safety Administration report on the crash can be found here. The NTSB has not yet published its full report; a synopsis of it can be found here.
(Score: 3, Insightful) by BasilBrush on Wednesday September 13 2017, @10:09PM (5 children)
The more autonomous systems on the road, the fewer red lights will be jumped.
Hurrah! Quoting works now!
(Score: 3, Interesting) by frojack on Thursday September 14 2017, @12:45AM (4 children)
Maybe. Maybe not. Why wouldn't the AI in the software know about the 2 second all-way red, or the extended yellow, and try to take advantage of it just like drivers do? Remember, to err is human. To really fuck things up you need a computer.
I can see improvement in this area when all cars communicate with each other indicating speed, location, and direction.
Tinfoil sales are likely to hit boom times. But in-dash warnings of unsafe crossing conditions (even when light is green) will provide safety benefits to human drivers as well as autonomous cars.
No, you are mistaken. I've always had this sig.
(Score: 2) by BasilBrush on Thursday September 14 2017, @01:44AM (1 child)
Because the number one objective is safety.
Hurrah! Quoting works now!
(Score: 0) by Anonymous Coward on Thursday September 14 2017, @08:16AM
If the number one objective is safety, they wouldn't have built a broken system that requires humans to do the one thing that humans are especially bad at, and computers are really good at: just sitting there, monitoring traffic, ready to take over at short notice.
(Score: 0) by Anonymous Coward on Thursday September 14 2017, @04:02AM
For starters, there is no AI.
(Score: 2) by DeathMonkey on Thursday September 14 2017, @07:46PM
Why wouldn't the AI in the software know about the 2 second all-way red, or the extended yellow, and try to take advantage of it just like drivers do?
Because auto manufacturers aren't stupid. Well...THAT stupid anyway!