

posted by Fnord666 on Sunday January 22 2017, @02:14PM   Printer-friendly
from the not-the-NTSB dept.

Last Thursday the National Highway Traffic Safety Administration delivered the results of its investigation of the 2016 crash of Joshua Brown while he was driving a Tesla with Autopilot software.

"A safety-related defect trend has not been identified at this time and further examination of this issue does not appear to be warranted," NHTSA's investigators found. In other words: Tesla didn't cause Brown's death.

The verdict should relieve not just the electric car builder, but the industry at large. Semi-autonomous and driver assistance technologies are more than a fresh source of revenue for automakers. They're the most promising way to cut into the more than 30,000 traffic deaths on US roads every year. Today's systems aren't perfect. They demand human oversight and can't handle everything the real world throws at them. But they're already saving lives.

NHTSA's goal wasn't to find the exact cause of the crash (that's up to the National Transportation Safety Board, which is running its own inquiry), but to root out any problems or defects with Tesla's Autopilot hardware and software.

The content of the investigation report is available from the NHTSA web site.

[Editor's note: Recently the link to the report has been returning an error occasionally. As an alternative, the Google webcache of the page is available as is a copy of the report at archive.org .]


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Interesting) by VLM (445) on Sunday January 22 2017, @05:03PM (#457368)

    Autopilot

    Control the language, control the narrative. It's not an autopilot.

    Autopilots come from aircraft, and to a lesser extent ships, and they're a PID algorithm that tries to keep the vehicle moving in a specified direction or toward a specified point. They don't actually pilot, which makes the name even more controversial: given the GPS coordinates of a skyscraper, one will happily fly into it, and a boat's autopilot will cheerfully hold a course that runs over waterskiers and swimmers or goes right up the beach.
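    For anyone who hasn't seen one, the PID loop the parent is describing really is that simple. Here's a toy heading-hold sketch; the gains, the plant model, and all the names are made up for illustration, not taken from any real autopilot:

    ```python
    # Toy sketch of the PID loop behind a classic heading-hold autopilot.
    # Gains and the "plant" model are illustrative assumptions only.

    class PID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, setpoint, measured, dt):
            # Error between commanded heading and actual heading.
            error = setpoint - measured
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt
            self.prev_error = error
            return (self.kp * error
                    + self.ki * self.integral
                    + self.kd * derivative)

    # Hold a heading of 90 degrees, starting from 80.
    ctrl = PID(kp=0.8, ki=0.1, kd=0.2)
    heading = 80.0
    for _ in range(50):
        correction = ctrl.update(setpoint=90.0, measured=heading, dt=0.1)
        heading += correction * 0.1  # toy plant: rudder nudges the heading
    ```

    Note that nothing in the loop knows what lies along the heading. The controller only drives the error to zero, which is exactly the parent's point: point it at a skyscraper and it will minimize the error to the skyscraper.
    
    
    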

    The driving emulators are trying to do far more than an autopilot does, and they fail often enough that you can't rely on them.

    It's very weird, because every form of labor except life-critical real-time software engineering is getting cheaper, so pilot labor, which is already cheap, is only going to get cheaper. Ditto for taxi drivers, or harbor pilots for boats. Meanwhile, people who should know better are spending inordinate amounts of expensive development money on replacing ever-cheaper labor. That can't possibly end well.

    Financial crashes have been bubble-caused for a long time. The connection is tenuous, but the upcoming crash looks more like a "stupid bubble" than the traditional manipulated financial bubble. You could argue that bubbles lead to cheap cash for corporations, which leads to dumb R+D spending, which leads to financial crashes, maybe.

    In the real world, you start with "technicians" or "mechanics" pumping gas into cars while trying to sell services. After minimum-wage increases and cost-saving efforts, people got stuck pumping their own gas. There's no reason the whole thing couldn't be automated with a robot arm, other than that progress never moves that way: you never see an expense pawned off on the customer come back to the provider as a huge cost center just to make the customer's life more convenient. As another example, we're not going to get robot waiters, because we already have self-service.

    Starting Score:    1  point
    Moderation   +3  
       Interesting=3, Total=3
    Extra 'Interesting' Modifier   0  
    Karma-Bonus Modifier   +1  

    Total Score:   5