
posted by martyb on Wednesday June 21 2017, @09:20AM   Printer-friendly
from the More-of-a-passenger-than-a-driver dept.

The first fatality involving Tesla's Autopilot feature raised questions about the safety of the semi-autonomous system last year, but a report published by the National Transportation Safety Board (NTSB) concludes that Elon Musk's company was not at fault. While the cause of the crash has still not been determined, the 538-page report states that driver Joshua Brown had his hands off the wheel of the Tesla Model S "for the vast majority of the trip." This was despite his receiving seven visual warnings, six of which were accompanied by an audible chime, to retake control during the 37-minute journey.

Green Car Reports states:

The truck driver involved in the crash also claimed Brown was watching a movie at the time of impact—an aftermarket DVD player was found among the wreckage.

On the other hand, Ars Technica reports otherwise:

In the latest regulatory documents on the incident, the National Transportation Safety Board disputed some accounts that Brown was watching a Harry Potter movie during the crash last year. The board said it found several electronic devices, but there was no evidence that they were being operated during the accident.

Ars elaborates on the amount of time that the driver had his hands on the wheel:

Tesla's Autopilot mode allows a vehicle to maintain the speed of traffic, and an auto-steer function is designed to help keep the Tesla inside its lane. The board said the Tesla alerted the driver seven times with a visual "Hands Required Not Detected" warning. The authorities said the motorist, a former Navy SEAL, had his hands on the wheel for only 25 seconds of the 37-minute trip during which they should have been on the steering wheel. That's according to "system performance data" from Tesla, the government said.


Original Submission

 
  • (Score: 2, Informative) by fustakrakich on Wednesday June 21 2017, @11:31AM (6 children)

    by fustakrakich (6150) on Wednesday June 21 2017, @11:31AM (#528956) Journal

    Well, it wasn't exactly 'faultless'. The car could have slowed down and/or stopped when the driver didn't heed the warnings.

    --
Politics and criminals are the same thing.
  • (Score: 2) by wonkey_monkey on Wednesday June 21 2017, @03:30PM (3 children)

    by wonkey_monkey (279) on Wednesday June 21 2017, @03:30PM (#529057) Homepage

    That's far more dangerous than simply continuing to drive. Not, perhaps, as dangerous as driving into a truck, but we don't know exactly why that happened yet.

    --
    systemd is Roko's Basilisk
    • (Score: 4, Insightful) by vux984 on Wednesday June 21 2017, @03:50PM (1 child)

      by vux984 (5045) on Wednesday June 21 2017, @03:50PM (#529064)

      That's far more dangerous than simply continuing to drive.

      Nope. Suppose the driver has fallen asleep.

That's not to say the car should pull over on a bridge or something, but common sense dictates it pull over as soon as it is somewhere safe. If it can't figure that out, it's not ready to drive.

      • (Score: 2) by wonkey_monkey on Thursday June 22 2017, @09:04PM

        by wonkey_monkey (279) on Thursday June 22 2017, @09:04PM (#529668) Homepage

        Nope. Suppose the driver has fallen asleep.

So what if he has? That's not a situation the technology, in its current state, is designed to cope with (although, by continuing to drive within lane markings and keeping a safe distance from other vehicles, it could potentially still cope with it better than a dumb car can).

If it can't figure that out, it's not ready to drive.

        No, it's just not ready to have its driver fall asleep at the wheel, and/or ignore all the warnings telling them they're not using the system as safety and common sense dictates.

        --
        systemd is Roko's Basilisk
    • (Score: 1, Insightful) by Anonymous Coward on Wednesday June 21 2017, @07:25PM

      by Anonymous Coward on Wednesday June 21 2017, @07:25PM (#529158)

      Nonsense. If a car's cruise control fails (used to have a car with a wonky cruise control myself), the vehicle comes to a stop if the driver's foot isn't on the pedal.

Everybody is taking this marketing babble "autopilot" and assuming that means "autonomous." Why the ever living fuck can't Tesla just give a mea culpa and admit that their marketing gave the wrong impression, or however weasel lawyers would put it, and STOP FUCKING CALLING IT AUTOPILOT. IT'S ENHANCED FUCKING CRUISE FUCKING CONTROL WITH FUCKING LANE FUCKING ASSIST! NOTHING MORE! Fuck!

  • (Score: 2) by FatPhil on Thursday June 22 2017, @07:46AM (1 child)

    by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Thursday June 22 2017, @07:46AM (#529410) Homepage
    But I never said it was faultless.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves