
posted by Fnord666 on Sunday November 04 2018, @07:04AM   Printer-friendly
from the shouldn't-it-be-auto-driver? dept.

Submitted via IRC for Bytram

Another Tesla with Autopilot crashed into a stationary object—the driver is suing

Earlier this month, Shawn Hudson's Tesla Model S crashed into a stalled car while moving at about 80 miles per hour on a Florida freeway. Tesla's Autopilot technology was engaged at the time, and Hudson has now filed a lawsuit against Tesla in state courts.

"Through a pervasive national marketing campaign and a purposefully manipulative sales pitch, Tesla has duped consumers" into believing that Autopilot can "transport passengers at highway speeds with minimal input and oversight," the lawsuit says.

Hudson had a two-hour commute to his job at an auto dealership. He says that he heard about Tesla's Autopilot technology last year and went to a Tesla dealership to learn more.

"Tesla's sales representative reassured Hudson that all he needed to do as the driver of the vehicle is to occasionally place his hand on the steering wheel and that the vehicle would 'do everything else,'" the lawsuit claims.


But that description of Tesla's Autopilot system is not true. While the system can handle a range of driving conditions, it's not designed to stop for parked cars or other stationary objects when traveling at highway speeds. This year, at least two other Tesla drivers have plowed into parked vehicles while their cars were in Autopilot mode (one of them sued Tesla last month). Another Tesla customer, Californian Walter Huang, was killed when his Tesla vehicle ran into a concrete lane divider at full speed.

"It is the driver's responsibility to remain attentive to their surroundings and in control of the vehicle at all times," a Tesla spokesman told Ars by email. "Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner's Manual and Release Notes for software updates." (I've reproduced Tesla's full emailed statement at the end of the story.)


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2, Troll) by VLM on Sunday November 04 2018, @02:17PM (2 children)

    by VLM (445) on Sunday November 04 2018, @02:17PM (#757612)

    3) There's a lot of personal decision about risk-taking that's asymmetric. Accident rates in the USA are high because of 2am drunks, unlicensed drivers, and so forth, so the average risk in the USA means something, but my chosen lifestyle results in a much lower risk. With self-crashing cars (don't do the PR thing and call them self-driving... they're self-crashing cars...) the risk is mysterious and maybe around the USA average, but I'm not USA average, so it's WAY riskier for me.

    It's kinda like deciding to legalize asbestos because the lung cancer risk is lower than the USA average of a mixed population of 4-pack-a-day boomers and non-smokers. Sure, maybe the asbestos lung cancer risk is lower than the risk for a hypothetical average 2-pack-a-day smoker who doesn't exist, but that means the non-smokers should be up in arms and marching with pitchforks.

    If you don't drink and drive (under the limit or not), if you don't use drugs, if you're not half blind, etc., in other words if you're an average "safe" driver, then don't accept safety stats that are merely as good as those of the average, known-to-be-shitty driver.
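    The statistical point here, that a blended population average can hide large subgroup differences, is easy to sketch with toy numbers. A minimal Python illustration, where every rate and share below is invented for the example and none are real crash statistics:

    ```python
    # Toy sketch of the "average risk" argument above.
    # All numbers are made up for illustration.

    # Hypothetical crash rates per million miles for two driver groups.
    impaired_rate = 10.0    # drunk/distracted/unlicensed drivers (made up)
    careful_rate = 1.0      # sober, attentive drivers (made up)
    impaired_share = 0.2    # fraction of total miles driven by the impaired group

    # The national "average" is a mileage-weighted mixture of the two groups.
    average_rate = impaired_share * impaired_rate + (1 - impaired_share) * careful_rate
    print(average_rate)  # 0.2*10 + 0.8*1 = 2.8

    # A system that merely beats the blended average...
    autopilot_rate = 2.5
    assert autopilot_rate < average_rate   # "safer than the average driver" holds
    # ...can still be considerably riskier than the careful subgroup's baseline.
    assert autopilot_rate > careful_rate
    ```

    The same arithmetic underlies the asbestos analogy below: beating a mixture average says little about the risk faced by the safest members of the mixture.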

  • (Score: 1, Insightful) by Anonymous Coward on Sunday November 04 2018, @03:03PM

    by Anonymous Coward on Sunday November 04 2018, @03:03PM (#757625)

    If we're going to claim that these AI cars are safer than ones driven by people, we really should exclude those sorts of things from the statistics. It's not hard to make a car that's safer than somebody who is drunk, high, or asleep, or who spends half their time looking at a cellphone while driving.

    We also need to consider that these cars are still only acceptable-ish in good conditions, and, as the article notes, Teslas can't handle stationary objects next to the roadway — something that pretty much anybody who drives a car and isn't in one of the aforementioned groups handles routinely without any drama.

  • (Score: 2) by Nuke on Sunday November 04 2018, @04:25PM

    by Nuke (3162) on Sunday November 04 2018, @04:25PM (#757652)

    "Its kinda like deciding to legalize asbestos because the lung cancer risk is lower than the USA average of a mixed multicultural population of 4-pack-a-day boomers and non-smokers."

    I think a better analogy to forcing us to use self-driving cars is forcing everyone to smoke one cigarette a day because that is safer than the national average (across smokers and non-smokers) of two a day (or whatever the figure is).