posted by janrinok on Monday April 02 2018, @01:27PM   Printer-friendly
from the I'll-wait-until-the-bugs-are-ironed-out dept.

Tesla Model X driver dies in Mountain View crash

Submitted via IRC for Fnord666

The driver of a Tesla Model X has died following a highway crash in Mountain View, leaving a number of safety questions.


Tesla Crash: Model X Was In Autopilot Mode, Firm Says

In a post on its website, the electric-car maker said computer logs retrieved from the wrecked SUV show that Tesla's driver-assisting Autopilot technology was engaged and that the driver doesn't appear to have grabbed the steering wheel in the seconds before the crash.

The car's 38-year-old driver died after the vehicle hit a concrete lane divider on a Northern California freeway and caught fire. The accident happened March 23.

[...] In its Friday post, Tesla said the crashed Model X's computer logs show that the driver's hands weren't detected on the steering wheel for 6 seconds prior to the accident. It said they also show the driver had "about five seconds and 150 meters of unobstructed view of the concrete divider" before the crash but that "no action was taken."

The company cited various statistics in defending Autopilot in the post and said there's no doubt the technology makes vehicles safer than traditional cars.

"Over a year ago," the post said, "our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent. Internal data confirms that recent updates to Autopilot have improved system reliability."

"Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur," the post reads. "It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."

Original Submission #1 · Original Submission #2

  • (Score: 3, Insightful) by Nuke on Monday April 02 2018, @10:10PM (2 children)

    by Nuke (3162) on Monday April 02 2018, @10:10PM (#661693)

    The Electrek link video tells me all I need to know about SD cars at the present time. If they can be fooled by so elementary a situation then they have a long long way to go. If they can fuck up on a wide, open, well lit, signed and marked bit of road like that, I hate to think how they would manage on the roads around me in a rural part of the UK for example.

    I wonder when the shills will stop claiming they are safer than an average human driver, unless the drivers where they live are very bad indeed. The guys seen driving ahead in that video managed to pass that point without killing themselves. Whatever the crash statistics of SD cars are, they are not as good as mine, because they have crashed and I have not. Small sample, yes, but SD cars are themselves still a small sample, and mostly with test drivers aboard who are more alert than the "average" driver would be. Wait until the latter start using them. Cases like this show what happens when the driver is not ready to intervene at all times; he needs to be just as keyed up and constantly making decisions as if he were driving himself anyway, which leaves us wondering what the point is.

  • (Score: 3, Interesting) by LoRdTAW on Monday April 02 2018, @11:55PM

    by LoRdTAW (3755) on Monday April 02 2018, @11:55PM (#661724) Journal

    I personally believe the so-called shills for autonomous cars are people who have no real understanding of computers and of how incredibly complex these problems are to solve, but who pretend to understand. Sure, we have computer vision and there are cool demos, but imagine all of those demos running at once, with algorithms deciding what's a person, the road, a sign, and a near-infinite number of other objects and patterns.

    Putting these barely tested time bombs on the road is another great demonstration of man's hubris with regard to technological advancement. Time to admit we DO NOT have safe autonomous vehicles on the road. We have a lot more work to do to prove otherwise.

  • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @08:02AM

    by Anonymous Coward on Tuesday April 03 2018, @08:02AM (#661856)

    and mostly with test drivers aboard being more alert than the "average" driver would be

    Even the Uber test driver in the car that killed a pedestrian recently looked up from the phone only about as often as an average cell-phone-using driver in a *non self driving* car.

    Just wait until they start putting people who are used to texting and driving into self driving cars... They won't look up in time to realize they are about to hit someone until it's too late to brake, like in that Uber video. They will be looking in the mirror thinking "what was that bump?"