
posted by janrinok on Monday April 02 2018, @01:27PM   Printer-friendly
from the I'll-wait-until-the-bugs-are-ironed-out dept.

Tesla Model X driver dies in Mountain View crash

Submitted via IRC for Fnord666

The driver of a Tesla Model X has died following a highway crash in Mountain View, leaving a number of safety questions.

Source: https://www.engadget.com/2018/03/24/tesla-model-x-driver-dies-in-mountain-view-crash/

Tesla Crash: Model X Was In Autopilot Mode, Firm Says

In a post on its website, the electric-car maker said computer logs retrieved from the wrecked SUV show that Tesla's driver-assisting Autopilot technology was engaged and that the driver doesn't appear to have grabbed the steering wheel in the seconds before the crash.

The car's 38-year-old driver died after the vehicle hit a concrete lane divider on a Northern California freeway and caught fire. The accident happened March 23.

[...] In its Friday post, Tesla said the crashed Model X's computer logs show that the driver's hands weren't detected on the steering wheel for 6 seconds prior to the accident. It said they also show the driver had "about five seconds and 150 meters of unobstructed view of the concrete divider" before the crash but that "no action was taken."
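As a quick sanity check (our arithmetic, not a figure from Tesla's post), 150 meters covered in about five seconds works out to roughly normal freeway speed:

    # Implied speed from the figures quoted above: ~150 m of unobstructed
    # view of the divider, covered in ~5 s before impact.
    distance_m = 150.0
    time_s = 5.0

    speed_ms = distance_m / time_s   # 30.0 m/s
    speed_kmh = speed_ms * 3.6       # 108.0 km/h
    speed_mph = speed_ms / 0.44704   # ~67 mph

    print(f"{speed_ms:.0f} m/s = {speed_kmh:.0f} km/h = {speed_mph:.0f} mph")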

The company cited various statistics in defending Autopilot in the post and said there's no doubt the technology makes vehicles safer than traditional cars.

"Over a year ago," the post said, "our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent. Internal data confirms that recent updates to Autopilot have improved system reliability."

"Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur," the post reads. "It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."


Original Submission #1 | Original Submission #2

 
  • (Score: 4, Informative) by LoRdTAW on Monday April 02 2018, @01:58PM (6 children)

    by LoRdTAW (3755) on Monday April 02 2018, @01:58PM (#661452) Journal

    And this genius provides a demonstration:
    https://electrek.co/2018/04/02/tesla-fatal-autopilot-crash-recreation/ [electrek.co]

  • (Score: 1, Interesting) by Anonymous Coward on Monday April 02 2018, @02:35PM (2 children)

    by Anonymous Coward on Monday April 02 2018, @02:35PM (#661480)

    Thanks for that electrek link; it's about what I expected given the clues in various other reports. The Tesla page mentions the crash attenuator, which had apparently already been crushed in a previous accident(??). In that case, will the highway dept share liability for not fixing it promptly? The attenuator might be a stack of the common yellow barrels (the Fitch barrier, invented by WWII fighter pilot and race car driver John Fitch) or some other design. The Fitch design is clever: the ballast (sand or water) is held up off the ground, so that the center of mass of the barrier is at a similar height to the CG of a car.

    Is this scenario similar to the Tesla that ran into the stopped emergency vehicle not long ago? Something stopped in the lane seems to be filtered out or ignored by the video system.

    • (Score: 2, Informative) by tftp on Monday April 02 2018, @09:12PM (1 child)

      by tftp (806) on Monday April 02 2018, @09:12PM (#661664) Homepage

      No, in that case it was a different bug. Autopilot was following a car, but then the leading car left the lane (since its driver saw the fire truck ahead). The Autopilot detected that there was no longer a car ahead and accelerated the Tesla back to the preset speed, with which it proceeded to hit the fire truck.
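
      A minimal sketch of how that failure mode can arise in a simple adaptive-cruise controller (purely illustrative, not Tesla's actual code; the class, thresholds, and numbers below are all made up):

          from dataclasses import dataclass

          @dataclass
          class RadarReturn:
              range_m: float     # distance to the object
              speed_ms: float    # object's absolute speed
              in_ego_lane: bool  # whether the return falls in our lane

          PRESET_SPEED = 29.0  # m/s; the driver-set cruise speed (assumed)

          def select_target(returns):
              """Pick the nearest in-lane *moving* object. Stationary returns
              are dropped -- a common radar-ACC compromise that avoids braking
              for overpasses and signs, but also ignores a stopped fire truck."""
              moving = [r for r in returns
                        if r.in_ego_lane and abs(r.speed_ms) > 1.0]
              return min(moving, key=lambda r: r.range_m, default=None)

          def cruise_command(returns):
              target = select_target(returns)
              if target is None:
                  # Lead car changed lanes: nothing "valid" ahead, so the
                  # controller accelerates back toward the preset speed.
                  return PRESET_SPEED
              return min(PRESET_SPEED, target.speed_ms)  # match the lead car

          lead_car = RadarReturn(40.0, 25.0, True)
          fire_truck = RadarReturn(80.0, 0.0, True)
          print(cruise_command([lead_car, fire_truck]))  # 25.0 -- following the lead car
          print(cruise_command([fire_truck]))            # 29.0 -- speeds up at the truck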

      The warning about this behavior was prominently printed on page 358 of the manual, second paragraph from the bottom. The driver is, of course, expected to know all the listed bugs and be ready to counter them at any time.

      If you ask me, I'd rather drive the car myself; it is less tiring and much safer. I can't imagine what it is like to ride in a seemingly friendly car that loves to kill you as soon as it sees an opportunity.

      • (Score: 2) by LoRdTAW on Monday April 02 2018, @11:43PM

        by LoRdTAW (3755) on Monday April 02 2018, @11:43PM (#661722) Journal

        I wonder if there is a way to permanently disable it, in case one of those bugs includes self-awareness or, at the very least, malfunction.

  • (Score: 3, Insightful) by Nuke on Monday April 02 2018, @10:10PM (2 children)

    by Nuke (3162) on Monday April 02 2018, @10:10PM (#661693)

    The Electrek video tells me all I need to know about SD cars at the present time. If they can be fooled by so elementary a situation, then they have a long, long way to go. If they can fuck up on a wide, open, well-lit, signed and marked bit of road like that, I hate to think how they would manage on the roads around me in a rural part of the UK, for example.

    I wonder when the shills will stop claiming they are safer than an average human driver, unless the drivers where they live are very bad indeed. Nevertheless, the guys seen driving ahead in that video managed to pass that point without killing themselves. Whatever the crash statistics of SD cars are, they are not as good as mine, because they have crashed and I have not. Small sample, but SD cars are themselves still a small sample, and mostly with test drivers aboard who are more alert than the "average" driver would be. Wait until the latter start using them.

    Cases like this show what happens when the driver is not ready to intervene at all times; he needs to be just as keyed up and constantly making decisions as if he were driving himself anyway, leaving us wondering what the point is.

    • (Score: 3, Interesting) by LoRdTAW on Monday April 02 2018, @11:55PM

      by LoRdTAW (3755) on Monday April 02 2018, @11:55PM (#661724) Journal

      I personally believe the so-called shills for autonomous cars are people who have no real understanding of computers and how incredibly complex these problems are to solve, but pretend to understand. Sure, we have computer vision and there are cool demos, but imagine all of those demos running at once, with algorithms deciding what's a person, the road, a sign, and a near-infinite number of other objects and patterns.
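
      To put a rough number on that (a toy sketch; the stage names and timings below are invented, not measurements from any real system): a perception stack has a hard real-time budget per camera frame, and every one of those "demos" has to fit inside it at once.

          # Hypothetical per-frame budget: a 30 fps camera leaves ~33 ms
          # for *all* perception work on each frame.
          FRAME_BUDGET_S = 1.0 / 30

          # Invented stage costs, standing in for the many detectors that
          # would have to run concurrently on every single frame.
          PIPELINE = [
              ("lane_detection",       0.008),
              ("vehicle_detection",    0.012),
              ("pedestrian_detection", 0.010),
              ("sign_recognition",     0.007),
          ]

          total_s = sum(cost for _, cost in PIPELINE)  # 0.037 s
          verdict = "OK" if total_s <= FRAME_BUDGET_S else "OVER BUDGET"
          print(f"{total_s*1000:.0f} ms used of {FRAME_BUDGET_S*1000:.1f} ms: {verdict}")
          # 37 ms of 33.3 ms: OVER BUDGET -- four modest detectors already miss frames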

      Putting these barely tested time bombs on the road is another great demonstration of man's hubris with regard to technological advancement. Time to admit we DO NOT have safe autonomous vehicles on the road. We have a lot more work to do to prove otherwise.

    • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @08:02AM

      by Anonymous Coward on Tuesday April 03 2018, @08:02AM (#661856)

      and mostly with test drivers aboard being more alert than the "average" driver would be

      Even the Uber test driver in the car that killed a pedestrian recently looked up from the phone about as often as an average cell-phone-using driver in a *non self driving* car.

      Just wait until they start putting people who are used to texting and driving into self-driving cars... They won't look up in time to realize they are about to hit someone, and by then it will be too late to brake, like in that Uber video. They will be looking in the mirror thinking, "what was that bump?"