Federal investigators announced Tuesday that the design of Tesla's semiautonomous driving system allowed the driver of a Tesla Model S in a fatal 2016 crash with a semi-truck to rely too heavily on the car's automation.
"Tesla allowed the driver to use the system outside of the environment for which it was designed," said National Transportation Safety Board Chairman Robert Sumwalt. "The system gave far too much leeway to the driver to divert his attention."
The board's report identifies the probable cause of the collision as the truck driver's failure to yield, combined with the Tesla driver's overreliance on his car's automation — or Autopilot, as Tesla calls the system. Tesla's system design was cited as a contributing factor.
[...] A Tesla spokesperson provided a statement to ABC News that read, "We appreciate the NTSB's analysis of last year's tragic accident, and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."
According to The Associated Press, members of the family of Joshua Brown, the Tesla driver killed in the crash, said on Monday that they do not blame the car or the Autopilot system for his death.
A National Highway Traffic Safety Administration report on the crash can be found here. The NTSB has not yet published its full report; a synopsis of it can be found here.
(Score: 5, Insightful) by n1 on Wednesday September 13 2017, @08:36PM (9 children)
I agree with your sentiment, but Tesla and Musk have been very keen to promote Autopilot capabilities beyond what is really appropriate, in my opinion.
When this crash happened, Tesla's website was promoting Autopilot as 'automatic steering, speed, lane changing and parking'.
Around the same time elsewhere...
More recently, Elon Musk boasted [thestreet.com] on Twitter about watching the eclipse through the glass roof of a Model S whilst using Autopilot.
Musk told reporters that the Model S was “probably better than humans at this point in highway driving”. Before the updated autopilot was released, he said that the car was “almost able to go [between San Francisco and Seattle] without touching the controls at all”.
Talulah Riley, Musk’s [ex]wife, shared and deleted an Instagram video of herself driving on the highway between Los Angeles and San Diego without holding the wheel. [theguardian.com]
This PR vs Terms & Conditions gap gets worse when there's the ever-present promotion that next week your car could get an update making Autopilot even better (or worse) than it was, and the intended use of the system changes while you sleep.
(Score: 3, Touché) by DannyB on Wednesday September 13 2017, @09:02PM (8 children)
Yep. Autopilot may indeed be better than some humans at highway driving. But that doesn't make it a replacement for paying attention to the road and keeping your hands on the steering wheel; it's not a license for the driver to sit there sending tweets.
To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
(Score: 4, Insightful) by frojack on Wednesday September 13 2017, @09:22PM (7 children)
The Straight Crossing Path scenario (see page 3 of the NHTSA report [nhtsa.gov]), such as a car running a red light, is almost impossible for any autonomous driving software to detect, especially if it includes multiple lanes of stopped traffic, or trees, or similar sight-line issues. The crossing car is in the sensors' sight lines for far too short a time.
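As a rough back-of-envelope sketch (the gap distance and crossing speed below are assumed numbers for illustration, not figures from the NHTSA report), the detection window really is tiny:

    # Rough numbers only: how long a crossing car stays in the sensors'
    # clear sight line once it emerges from behind an obstruction.
    def time_in_view(visible_gap_m: float, crossing_speed_kmh: float) -> float:
        """Seconds the crossing car is visible before reaching the conflict point."""
        return visible_gap_m / (crossing_speed_kmh / 3.6)  # km/h -> m/s

    # If the sight line opens only 20 m before the conflict point and the
    # crossing car is doing 50 km/h, that's roughly 1.4 s to detect,
    # classify and brake.
    print(f"{time_in_view(20, 50):.1f} s")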
No autopilot system on the road has been certified for this, even though BMW, Tesla, and Volvo all offer autopilot systems.
Which makes it hard to believe that fully autonomous driverless cars are going to fare well in this situation either.
People need only watch a few Russian dash cam videos on YouTube to see just how frequently this happens.
No, you are mistaken. I've always had this sig.
(Score: 3, Insightful) by Nerdfest on Wednesday September 13 2017, @09:36PM
People suck just as badly at that scenario though, maybe worse.
(Score: 3, Insightful) by BasilBrush on Wednesday September 13 2017, @10:09PM (5 children)
The more autonomous systems on the road, the fewer red lights will be jumped.
Hurrah! Quoting works now!
(Score: 3, Interesting) by frojack on Thursday September 14 2017, @12:45AM (4 children)
Maybe. Maybe not. Why wouldn't the AI in the software know about the 2-second all-way red, or the extended yellow, and try to take advantage of it just like drivers do? Remember, to err is human. To really fuck things up you need a computer.
I can see improvement in this area when all cars communicate with each other indicating speed, location, and direction.
Tinfoil sales are likely to hit boom times. But in-dash warnings of unsafe crossing conditions (even when light is green) will provide safety benefits to human drivers as well as autonomous cars.
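As a minimal sketch (the field names are purely hypothetical, loosely in the spirit of the DSRC "basic safety message" idea rather than any real protocol), the broadcast could be as simple as:

    from dataclasses import dataclass, asdict
    import json
    import time

    # Hypothetical vehicle-to-vehicle beacon: each car periodically announces
    # its position, speed and heading so nearby cars (or an in-dash warning
    # system) can flag unsafe crossing traffic. Field names are illustrative only.
    @dataclass
    class V2VBeacon:
        vehicle_id: str     # anonymised, rotating identifier (the tinfoil concern)
        lat: float          # degrees
        lon: float          # degrees
        speed_mps: float    # metres per second
        heading_deg: float  # 0 = north, clockwise
        timestamp: float    # seconds since epoch

    beacon = V2VBeacon("temp-7f3a", 37.7749, -122.4194, 14.2, 92.0, time.time())
    print(json.dumps(asdict(beacon)))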
No, you are mistaken. I've always had this sig.
(Score: 2) by BasilBrush on Thursday September 14 2017, @01:44AM (1 child)
Because the number one objective is safety.
Hurrah! Quoting works now!
(Score: 0) by Anonymous Coward on Thursday September 14 2017, @08:16AM
If the number one objective is safety, they wouldn't have built a broken system that requires humans to do the one thing that humans are especially bad at, and computers are really good at: just sitting there, monitoring traffic, ready to take over at short notice.
(Score: 0) by Anonymous Coward on Thursday September 14 2017, @04:02AM
For starters, there is no AI.
(Score: 2) by DeathMonkey on Thursday September 14 2017, @07:46PM
Why wouldn't the AI in the software know about the 2-second all-way red, or the extended yellow, and try to take advantage of it just like drivers do?
Because auto manufacturers aren't stupid. Well...THAT stupid anyway!