
posted by Fnord666 on Sunday January 22 2017, @02:14PM   Printer-friendly
from the not-the-NTSB dept.

Last Thursday the National Highway Traffic Safety Administration delivered the results of its investigation into the fatal 2016 crash in which Joshua Brown died while driving a Tesla with Autopilot engaged.

"A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted," NHTSA's investigators found. In other words: Tesla didn't cause Brown's death.

The verdict should relieve not just the electric car builder, but the industry at large. Semi-autonomous and driver assistance technologies are more than a fresh source of revenue for automakers. They're the most promising way to cut into the more than 30,000 traffic deaths on US roads every year. Today's systems aren't perfect. They demand human oversight and can't handle everything the real world throws at them. But they're already saving lives.

NHTSA's goal wasn't to find the exact cause of the crash (that's up to the National Transportation Safety Board, which is running its own inquiry), but to root out any problems or defects with Tesla's Autopilot hardware and software.

The content of the investigation report is available from the NHTSA web site.

[Editor's note: The link to the report has recently been returning intermittent errors. As alternatives, the page is available in the Google web cache, and a copy of the report is available at archive.org.]


Original Submission

 
  • (Score: 4, Insightful) by Arik (4543) on Sunday January 22 2017, @04:51PM (#457366) Journal
    It's completely unworkable to have a system that makes the call 99.9% of the time while still expecting the human 'driver' to pay attention, remain alert, and be ready to seize back control whenever there's a problem. That model simply isn't compatible with human psychology. Either the computer must be capable enough that the 'driver' can safely fall asleep, or there must be a full-time human driver with the computer only providing information rather than actually driving. If the courts understand the issue, they will eventually put an end to this practice via liability.
    --
    If laughter is the best medicine, who are the best doctors?