

posted by martyb on Wednesday June 21 2017, @09:20AM   Printer-friendly
from the More-of-a-passenger-than-a-driver dept.

The first fatality involving Tesla's Autopilot feature led to questions over the safety of the semi-autonomous system last year, but a report published by the National Transportation Safety Board (NTSB) concludes that Elon Musk's company was not at fault. While the cause of the crash has still not been determined, the 538-page report states that driver Joshua Brown had his hands off the wheel of the Tesla Model S "for the vast majority of the trip." This was despite receiving seven visual warnings, six of which also sounded a chime, to maintain control during the 37-minute journey.

Green Car Reports states:

The truck driver involved in the crash also claimed Brown was watching a movie at the time of impact—an aftermarket DVD player was found among the wreckage.

On the other hand, Ars Technica reports otherwise:

In the latest regulatory documents on the incident, the National Transportation Safety Board disputed some accounts that Brown was watching a Harry Potter movie during the crash last year. The board said it found several electronic devices, but there was no evidence that they were being operated during the accident.

Ars elaborates on the amount of time that the driver had his hands on the wheel:

Tesla's autopilot mode allows a vehicle to maintain the speed of traffic, and an auto-steer function is designed to help keep the Tesla inside its lane. The board said the Tesla alerted the driver seven times with a visual of "Hands Required Not Detected." The authorities said the motorist, a former Navy Seal, had his hands on the wheel for 25 seconds during the 37 minutes of the trip when they should have been placed on the steering wheel. That's according to "system performance data" from Tesla, the government said.


Original Submission

 
  • (Score: 4, Insightful) by tonyPick on Wednesday June 21 2017, @10:24AM (4 children)

    by tonyPick (1237) on Wednesday June 21 2017, @10:24AM (#528943) Homepage Journal

    At http://www.reuters.com/article/us-tesla-crash-idUSKBN19A2XC [reuters.com]

    The NTSB report disclosed that the Tesla Model S uses a proprietary system to record a vehicle's speed and other data, which authorities cannot access with the commercial tools used to access information from event data recorders in most other cars.

    For that reason, the NTSB said it "had to rely on Tesla to provide the data in engineering units using proprietary manufacturer software."

    What The Actual Fuck? How is that allowed as evidence in an investigation? If the data isn't openly accessible to the NTSB then it should be considered flat out untrustworthy at best.

    (Even if you'd trust Tesla with this kind of setup, what about if it was a Volkswagen?)
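
    As a rough illustration of what "provide the data in engineering units" means in practice: a recorder stores raw counts, and they only become speeds and angles once per-channel scale and offset values are applied. A minimal sketch follows, in Python; the channel names and calibration numbers are invented for illustration and are not Tesla's.

        # Hypothetical illustration only: the channel names and the scale/offset
        # values below are invented, not taken from Tesla or the NTSB docket.

        RAW_RECORD = {"ch_speed": 1842, "ch_throttle": 512, "ch_steer": 31022}

        # Whoever holds this table decides what the raw counts "mean".
        DECODE_SPEC = {
            "ch_speed":    {"name": "vehicle_speed_mph", "scale": 0.05,   "offset": 0.0},
            "ch_throttle": {"name": "throttle_pct",      "scale": 0.1953, "offset": 0.0},
            "ch_steer":    {"name": "steering_deg",      "scale": 0.01,   "offset": -163.84},
        }

        def to_engineering_units(raw, spec):
            """Apply each channel's scale and offset to turn raw counts into physical values."""
            out = {}
            for channel, count in raw.items():
                cal = spec[channel]
                out[cal["name"]] = count * cal["scale"] + cal["offset"]
            return out

        print(to_engineering_units(RAW_RECORD, DECODE_SPEC))
        # -> roughly {'vehicle_speed_mph': 92.1, 'throttle_pct': 99.99, 'steering_deg': 146.38}

    Whoever holds the decode table effectively controls what the "data" says, which is the crux of the complaint above.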

  • (Score: 3, Informative) by fraxinus-tree on Wednesday June 21 2017, @10:41AM

    by fraxinus-tree (5590) on Wednesday June 21 2017, @10:41AM (#528946)

    Everything is allowed as evidence. Proof is something else, though.

  • (Score: 2) by MostCynical on Wednesday June 21 2017, @11:16AM

    by MostCynical (2589) on Wednesday June 21 2017, @11:16AM (#528952) Journal

    Are the "commercial tools", or the systems they can access, "read only", or has Tesla used something else to help prevent the wrong people getting access?
    OBD-based "service light reset" and even "mileage adjustment" tools are sold on eBay.

    Sounds sensible to make something... different. Maybe they could make it more transparent by leasing a reader to the NTSB.

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
  • (Score: 4, Informative) by choose another one on Wednesday June 21 2017, @04:35PM (1 child)

    by choose another one (515) Subscriber Badge on Wednesday June 21 2017, @04:35PM (#529095)

    For that reason, the NTSB said it "had to rely on Tesla to provide the data in engineering units using proprietary manufacturer software."

    What The Actual Fuck? How is that allowed as evidence in an investigation? If the data isn't openly accessible to the NTSB then it should be considered flat out untrustworthy at best.

    Well for a start, every air accident investigation (I think the NTSB have experience of a few...) that involves reading an FDR (black box) is exactly the same.

    There simply isn't a universal non-proprietary "black box reader" (and sometimes the boxes are so damaged the data has to be read from individual chips). Even if there were such a thing, reading the raw data is pointless, because it is meaningless without knowing the framing, parameter setup, parameter ranges, sample rates, calibrations, etc. (this is more than likely the actual issue here too). It can require input from the black box mfr, the aircraft mfr, and the aircraft owner/operator, and any of them may have a vested interest in nondisclosure and any of them could prevent the data being understood. And yet air-accident investigation seems to get on fine.

    When we are talking about reading "black boxes" in cars, it makes sense to learn how such things work in other industries where they have been used for decades.
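
    A minimal sketch of what "meaningless without knowing the framing, parameter setup ... calibrations" looks like in practice, assuming a made-up frame layout and made-up parameter spec (real FDR framing and calibrations come from the recorder and airframe documentation):

        import struct

        # Everything below is invented for illustration; real parameter positions,
        # bit widths and calibrations are defined by recorder/airframe documentation.
        frame = struct.pack(">8H", 2048, 512, 3071, 100, 0, 4095, 1234, 777)

        PARAM_SPEC = {
            # name: (word index, bits used, scale, offset, units)
            "pressure_alt": (0, 12, 4.0,   -1000.0, "ft"),
            "airspeed":     (1, 12, 0.125,   30.0,  "kt"),
            "pitch":        (2, 12, 0.088,  -180.0, "deg"),
        }

        def decode(frame_bytes, spec):
            """Pull each parameter out of its word and apply its calibration."""
            words = struct.unpack(">%dH" % (len(frame_bytes) // 2), frame_bytes)
            decoded = {}
            for name, (idx, bits, scale, offset, units) in spec.items():
                raw = words[idx] & ((1 << bits) - 1)   # keep only the defined bits
                decoded[name] = (raw * scale + offset, units)
            return decoded

        for name, (value, units) in decode(frame, PARAM_SPEC).items():
            print(f"{name}: {value:.1f} {units}")

    Without PARAM_SPEC the same sixteen bytes are just eight anonymous integers, which is presumably why the NTSB ends up asking the manufacturer for help regardless of industry.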

    • (Score: 2) by tonyPick on Thursday June 22 2017, @07:06AM

      by tonyPick (1237) on Thursday June 22 2017, @07:06AM (#529393) Homepage Journal

      Well for a start, every air accident investigation (I think the NTSB have experience of a few...) that involves reading an FDR (black box) is exactly the same.

      Except that FDR manufacture is regulated by ICAO standards and various international regulations, the software has to demonstrate compliance with DO-178, and the units are sealed throughout the evidence chain. (And I think the same is true of CVRs?)

      By comparison Tesla software is.... Looking for the kind of programmers who will work long hours for average pay [paysa.com]? Follow an Agile "move fast and break things" development model? [280group.com]

      And the NTSB can just trust that what the software says is correct, and that what Tesla says about what it says is correct?

      And this isn't just about Tesla: The NTSB shouldn't trust this stuff from anyone; they should be expecting to deal with the industry that gave us the Ford Pinto fires [howstuffworks.com].

      It can require input from the black box mfr, the aircraft mfr, and the aircraft owner/operator - and any of them may have a vested interest in nondisclosure and any of them could prevent the data being understood.

      Any of them might, but probably not *all* of them, and the systems are designed so that any one person or even a single group of people can't tamper with the recording in isolation. We have no idea how well validated the records from self driving cars are, or the extent to which they can be modified by a single person. As far as I can tell they are *literally* black boxes.
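
      For what it's worth, "can't tamper with the recording in isolation" usually implies some kind of tamper-evident log. Nothing public says how (or whether) Tesla protects its records, so the sketch below only shows one conceivable scheme, a simple hash chain, where rewriting an old entry breaks verification of everything after it:

          import hashlib
          import json

          # One *possible* tamper-evidence scheme, purely for illustration;
          # nothing here describes how Tesla actually stores its logs.

          def append(log, record):
              """Chain each entry to the digest of the previous one."""
              prev = log[-1]["digest"] if log else "0" * 64
              body = json.dumps(record, sort_keys=True)
              digest = hashlib.sha256((prev + body).encode()).hexdigest()
              log.append({"record": record, "digest": digest})

          def verify(log):
              """Recompute the chain; a rewritten entry breaks every later digest."""
              prev = "0" * 64
              for entry in log:
                  body = json.dumps(entry["record"], sort_keys=True)
                  if hashlib.sha256((prev + body).encode()).hexdigest() != entry["digest"]:
                      return False
                  prev = entry["digest"]
              return True

          log = []
          append(log, {"t": 0, "hands_on_wheel": False})
          append(log, {"t": 1, "alert": "Hands Required Not Detected"})
          assert verify(log)

          log[0]["record"]["hands_on_wheel"] = True   # quietly rewrite history
          assert not verify(log)                      # and the chain no longer checks out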