
posted by Fnord666 on Sunday January 22 2017, @02:14PM
from the not-the-NTSB dept.

Last Thursday, the National Highway Traffic Safety Administration delivered the results of its investigation into the 2016 crash that killed Joshua Brown while he was driving a Tesla with Autopilot engaged.

"A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted," NHTSA's investigators found. In other words: Tesla didn't cause Brown's death.

The verdict should relieve not just the electric car builder, but the industry at large. Semi-autonomous and driver assistance technologies are more than a fresh source of revenue for automakers. They're the most promising way to cut into the more than 30,000 traffic deaths on US roads every year. Today's systems aren't perfect. They demand human oversight and can't handle everything the real world throws at them. But they're already saving lives.

NHTSA's goal wasn't to find the exact cause of the crash (that's up to the National Transportation Safety Board, which is running its own inquiry), but to root out any problems or defects with Tesla's Autopilot hardware and software.

The full text of the investigation report is available from the NHTSA website.

[Editor's note: The link to the report has recently been returning errors intermittently. As alternatives, the page is available in the Google web cache, and a copy of the report is archived at archive.org.]


Original Submission

 
  • (Score: 5, Interesting) by ledow (5567) on Sunday January 22 2017, @04:27PM (#457357) Homepage

    Wrong question.

    The Tesla software didn't CAUSE the crash.

    But neither did it do anything to prevent the crash, even though any attentive driver would have been aware of the hazard for a full 7 seconds, and no mechanical fault prevented the car from simply coming to a controlled stop.
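
    (A back-of-the-envelope check, with assumed numbers: say an initial speed of 29 m/s, roughly 65 mph, and a gentle 5 m/s^2 of braking. Neither figure comes from the report; they're purely illustrative.)

        # Illustrative stopping-time check. Speed and deceleration are
        # assumptions for the sake of argument, not values from NHTSA.
        v0 = 29.0  # initial speed in m/s (~65 mph)
        a = 5.0    # deceleration in m/s^2, well short of a hard ~0.8 g stop

        t_stop = v0 / a           # time to come to rest: t = v0 / a
        d_stop = v0**2 / (2 * a)  # stopping distance:    d = v0^2 / (2a)

        print(f"stops in {t_stop:.1f} s over {d_stop:.0f} m")  # ~5.8 s, ~84 m

    Even that gentle stop fits inside the 7-second window with time to spare.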

    Something sold as a driver assist is a bit crap if it doesn't do that.

    Ultimately, under the law, the driver is at fault WHETHER OR NOT Tesla are at fault. He was driving without due care and attention (is this the one where the truck driver says that the guy was watching Harry Potter movies in the car when they went to pull him out?).

    But the Tesla software didn't detect, react to, or do anything about the impending collision. Which raises the question: do you REALLY want to trust it, even one little bit, after you paid THOUSANDS for it, if it can't spot a lorry turning across your path in 7 seconds?

    It's not that the Tesla software killed the guy; it's that it did nothing to save him, despite that being its entire purpose. It's like an airbag that doesn't go off in the same kind of accident: you haven't killed him directly, the impact did that, but what's the point of it being there, especially as an optional extra, if the LARGEST and MOST OBVIOUS hazard is entirely missed?
