
posted by Fnord666 on Thursday January 25 2018, @02:16AM
from the stay-alert-stay-alive dept.

El Reg reports:

[January 23] a Tesla Model S slammed into a stationary firetruck at around 65mph on Interstate 405 in Culver City, California. The car ended up wedged under the fire engine, although the driver was able to walk away from the crash uninjured and refused an offer of medical treatment.

The motorist claimed the Model S was operating with Autopilot enabled when it crammed itself under the truck. Autopilot is Tesla's super-cruise-control system, not a fully autonomous driving system.

[...] The fire truck was parked in the carpool lane of the road with its lights flashing. None of the fire crew were hurt, although Powell noted that if his team had been in their usual position at the back of the truck then there "probably would not have been a very good outcome."

Tesla will no doubt be going over the car's computer logs to determine exactly what happened, something the California Highway Patrol will also be interested in. If this was a case of the driver sticking on Autopilot and forgetting their responsibility to watch the road ahead, it wouldn't be the first time.

In 2016, a driver was killed after both he and the Tesla's systems missed a lorry pulling across the highway. A subsequent investigation by the US National Transportation Safety Board found the driver was speeding and had been warned by the car six times to keep his hands on the wheel.

Tesla has since beefed up the alerts the car will give a driver if it feels they aren't paying full attention to the road. The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.

Previous: Tesla's Semiautonomous System Contributed to Fatal Crash


Original Submission

 
  • (Score: 2) by AthanasiusKircher (5291) on Thursday January 25 2018, @07:26PM (#627821) Journal

    > Put it this way: If we existed in some improbable alternate universe where Tesla Autopilot had been invented before manual controls, would we be sat here arguing whether putting humans behind the wheel would rightfully save some lives at the expense of many more?

    I take your point. But in the very way you just framed that, you're automatically presuming a beneficial outcome. My point wasn't just about Tesla Autopilot (which I explicitly admitted likely prevents a lot more issues than it causes), but about judging such automated technologies in general.

    For example, many people who argue for completely autonomous cars phrase it as you did in your previous post -- i.e., once the accident stats are as good as the average stats for human drivers, we should view them as a good alternative. But I don't think that'd be a comfort to someone who was killed by an autonomous car acting in a completely stupid manner because the bugs weren't worked out.

    Bottom line is that there will always be side effects to the adoption of new technology, and some of those may be negative. All I'm saying is that it's rational to factor that into judging whether the tech is "better" than humans. Lots of accidents are caused by STUPID human error that is largely preventable (e.g., speeding, following too closely, etc.). I tend to be a much more cautious and conservative driver than average, so quoting average accident rates is not going to convince me to put my safety in the hands of some algorithm.

    But even if the algorithm had the stats of a "good driver," I also want to know not only that it would successfully navigate potential accident scenarios better than I would in some cases, but also that it's not going to randomly kill me by doing something completely weird and unpredictable that I, as a driver, would never do. And if such scenarios were more than freak accidents -- if they actually occurred with some regularity -- are you really telling me that you'd want to put your safety in the hands of such an algorithm, just based on the promise that it "performs as well as the average human driver" or even slightly better in terms of overall accident stats?

    Again, I'm not arguing that Tesla's feature isn't helpful -- only that unexpected negative outcomes should also be a serious factor to consider, along with summary stats.
