
posted by martyb on Friday July 01 2016, @01:49AM
from the Pay-attention! dept.

Two Soylentils wrote in with news of a fatal accident involving a Tesla vehicle. Please note that the feature in use, called "Autopilot", is not the same as an autonomous vehicle. It provides lane-keeping, cruise control, and safe-distance monitoring, but the driver is expected to be alert and in control at all times. -Ed.

Man Killed in Crash of 'Self-Driving' Car

Tech Insider reports that an Ohio man was killed on 7 May when his Tesla Model S, with its autopilot feature turned on, went under a tractor-trailer.

Further information:

Tesla Autopilot - Fatal Accident

http://www.cnbc.com/2016/06/30/us-regulators-investigating-tesla-over-use-of-automated-system-linked-to-fatal-crash.html

The accident is reported to have happened in May and to have been reported to NHTSA/DOT immediately by Tesla, but it was not made public until the end of June -- something seems a bit fishy about this reporting lag.

On the other hand, the accident is described as one that might also have been difficult for an alert human to avoid:

The May crash occurred when a tractor trailer drove across a divided highway, where a Tesla in autopilot mode was driving. The Model S passed under the tractor trailer, and the bottom of the trailer hit the Tesla vehicle's windshield.

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

This was the first reporting found; by the time it makes the SN front page, there may be more details. Because this is a "first", it seems likely that a detailed investigation and accident reconstruction will be performed.


Original Submission #1 | Original Submission #2

 
  • (Score: 5, Insightful) by RedBear (1734) Subscriber Badge on Friday July 01 2016, @03:40AM (#368257)

    Oh boy, do I have a problem with this part of the quotes:

    the accident is described as one that might also have been difficult for an alert human to avoid:

    The May crash occurred when a tractor trailer drove across a divided highway, where a Tesla in autopilot mode was driving. The Model S passed under the tractor trailer, and the bottom of the trailer hit the Tesla vehicle's windshield.
    "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

    There is no possible way Tesla could know whether or not the driver noticed the trailer. It may be argued to be a reasonable assumption, but it is still a very theoretical one. All they know for certain is that the driver did not apply the brake manually.

    Another assumption could just as easily be that the driver trusted the autopilot far too much and failed to apply the brake because he EXPECTED the car to do it. When you let an autopilot drive your vehicle, there is a tendency to be unsure whether you need to take over or not. You have to actively disengage your brain from the driving task to keep your own automatic motor-memory driving actions from interfering with the autopilot, and putting yourself back in "driver mode" can easily take a second or two, which is many car lengths at highway speeds. Anyone who hasn't practiced instantaneously taking over from the autopilot can easily have a brain fart and simply take too long to react, because until the autopilot goes "beep beep" and explicitly hands control back, the driver will tend to believe the autopilot is operating normally and everything is fine. This problem will only get worse as more people put more trust in autopilot systems.

    This is why I believe we shouldn't even be experimenting with autopilot systems on public roads yet. We should be sticking to things like emergency braking systems, lane-keeping alarms, and other aids that help keep the attentive human driver from making fatal mistakes, rather than relying on human drivers who aren't even paying attention to keep the artificial autopilots from making fatal mistakes. We have things totally backwards, in my opinion. The human driver should never be encouraged to take their attention off the road or their hands off the wheel as long as they are sitting in the driver's seat, but that's exactly what these autopilot systems encourage. Just yesterday I saw a GIF of a Tesla driver asleep behind the wheel while his car drove him down the highway. That is what autopilot systems are enabling.

    I like Tesla, but I find the above quote to be very self-serving. Whether he was looking in a different direction at that moment instead of straight ahead at the road, or simply hesitated a moment too long because misfiring neurons kept him from interfering with what he expected the autopilot to do, it is entirely possible that the driver's excessive trust in the autopilot was the primary thing that got him killed. But stating that possibility would be extremely detrimental to Tesla, even if it were expressed only as a remote one.

    --
    ¯\_ʕ◔.◔ʔ_/¯ LOL. I dunno. I'm just a bear.
    ... Peace out. Got bear stuff to do. 彡ʕ⌐■.■ʔ
  • (Score: 0) by Anonymous Coward on Friday July 01 2016, @01:35PM (#368388)

    According to some witnesses, the driver had Harry Potter playing... so I think I can guess he did not fuckin' notice the trailer, but you are right, it is not guaranteed.