
posted by takyon on Thursday June 14 2018, @02:37PM   Printer-friendly
from the crash-and-burn dept.

Submitted via IRC for Runaway1956

Tesla fatal crash: 'autopilot' mode sped up car before driver killed, report finds

A Tesla driving in "autopilot" mode crashed in March when the vehicle sped up and steered into a concrete barrier, according to a new report on the fatal collision, raising fresh concerns about Elon Musk's technology.

The National Transportation Safety Board (NTSB) said that four seconds before the 23 March crash on a highway in Silicon Valley, which killed Walter Huang, 38, the car stopped following the path of a vehicle in front of it. Three seconds before the impact, it sped up from 62mph to 70.8mph, and the car did not brake or steer away, the NTSB said.
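For scale, the reported speed change can be turned into an acceleration figure. This is a rough back-of-envelope calculation, assuming the 62 mph to 70.8 mph change took place over the full three seconds quoted; the NTSB report does not state the acceleration directly.

```python
# Rough check of the quoted NTSB figures: 62 -> 70.8 mph over ~3 seconds.
MPH_TO_MS = 0.44704  # exact conversion factor, miles per hour to metres per second

v0 = 62.0 * MPH_TO_MS   # ~27.7 m/s
v1 = 70.8 * MPH_TO_MS   # ~31.7 m/s
accel = (v1 - v0) / 3.0  # average acceleration in m/s^2

print(f"acceleration ~= {accel:.2f} m/s^2 ({accel / 9.81:.2f} g)")
```

That works out to roughly 1.3 m/s², or about 0.13 g: a modest but clearly deliberate acceleration rather than a coasting drift.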

[...] The NTSB report [...] has once again raised serious safety questions about the limits and performance of the autopilot technology, which is meant to assist drivers and has faced growing scrutiny from experts and regulators. Mark Fong, an attorney for Huang's family, also said the report appeared to "contradict Tesla's characterization" of the collision.

The NTSB press release includes this link to the preliminary report, for anyone inclined to read the slightly longer version of events.

The Mountain View Fire Department applied about 200 gallons of water and foam to extinguish the post-crash fire. The battery reignited five days after the crash in an impound lot and was extinguished by the San Mateo Fire Department.

Layoffs at Tesla

Tesla Lays Off 9 Percent Of Workforce

Tesla will lay off about 3,500 workers in an effort to boost profitability, CEO Elon Musk wrote in a company email.

"What drives us is our mission to accelerate the world's transition to sustainable, clean energy, but we will never achieve that mission unless we eventually demonstrate that we can be sustainably profitable," Musk wrote.

Musk conceded that Tesla has not made an annual profit in 15 years. The company posted its largest quarterly loss, of more than $700 million, earlier this year.


Original Submission #1 · Original Submission #2 · Original Submission #3

 
This discussion has been archived. No new comments can be posted.
  • (Score: 1) by tftp on Thursday June 14 2018, @04:29PM (3 children)

    by tftp (806) on Thursday June 14 2018, @04:29PM (#692992) Homepage

    Sure. But Tesla acts like a suicide driver - it accelerates toward the unknown. Tesla has no LIDAR; its autopilot is driven by a few cameras [electrek.co]. This is all by design. Stopping instead of accelerating would be safer (and would quickly teach drivers the limitations of the system) - but no, Tesla cars are made to be dangerous. And they don't stop in front of an obstacle - though this is the most valuable function of any automation. They just cannot figure it out, because all they have is a single-camera 2D view. They would need LIDAR to tell what is a shadow and what is an obstacle.

    In other words, the complaint is not against self-driving cars or automatic lane followers, but against bad, deadly implementations of them. Tesla is the world leader in causing accidents where a human driver would have sailed through without a worry. The solution is not to clench our teeth and sacrifice ourselves so that the coders can review the crash and patch their software, but to demand conformance to strict standards of behavior - in other words, to save the driver instead of killing them. Those standards have to be worked out and testing centers constructed - otherwise car manufacturers will keep testing on the public streets. We know how safe that is (see Uber). Those standards would demand that a car must not be equipped with an aid that lures drivers into danger. Tesla demands hands on the wheel - but does nothing to enforce it. Why not ring all the bells and gradually slow the car? The driver would quickly learn safe practices. But no, it's made so you can sleep behind the wheel, and the car still flies ahead.

  • (Score: 4, Interesting) by NewNic on Thursday June 14 2018, @05:59PM (2 children)

    by NewNic (6420) on Thursday June 14 2018, @05:59PM (#693051) Journal

    While everyone focuses on Tesla, it's important to note this snippet from the report:
    "The crash attenuator was an SCI smart cushion attenuator system, which was previously damaged on March 12, 2018, in a single-vehicle crash involving a 2010 Toyota Prius (see figure 3)."

    The safety device on the road was damaged and not functioning following a prior accident.

    --
    lib·er·tar·i·an·ism ˌlibərˈterēənizəm/ noun: Magical thinking that useful idiots mistake for serious political theory
    • (Score: 0) by Anonymous Coward on Thursday June 14 2018, @08:05PM (1 child)

      by Anonymous Coward on Thursday June 14 2018, @08:05PM (#693146)
      The thing I find interesting about this specific bit of data is: where is all the outrage about the Prius driver who did the same thing as the Tesla? They both ran into stationary objects. So in this one case, it simply shows that Tesla's flavor of self-driving is no more dangerous than a human. After all, a human drove into the barrier just a few weeks prior to the Tesla.
      • (Score: 0) by Anonymous Coward on Thursday June 14 2018, @08:41PM

        by Anonymous Coward on Thursday June 14 2018, @08:41PM (#693162)

        > After all, a human drove into the barrier just a few weeks prior to the Tesla.

        Before that human/Prius hit the barrier, how many hundreds of thousands of human-driven cars passed that point safely? And how many more after the Prius accident, in the weeks before the Tesla cleaned it out? Don't know, but most of them did. Is there any data on the Prius driver, were they impaired or distracted? Did they just graze the barrier enough to damage it, or did they auger in?

        Now -- how many Tesla cars on Autopilot passed that point safely? I don't know, but many times fewer - maybe only this one. It was an area with merging, and I'll wager that at least some Tesla drivers would have had their hands on the wheel and their eyes on the road at a critical point like this.