

posted by Fnord666 on Sunday November 04 2018, @07:04AM   Printer-friendly
from the shouldn't-it-be-auto-driver? dept.

Submitted via IRC for Bytram

Another Tesla with Autopilot crashed into a stationary object—the driver is suing

Earlier this month, Shawn Hudson's Tesla Model S crashed into a stalled car while moving at about 80 miles per hour on a Florida freeway. Tesla's Autopilot technology was engaged at the time, and Hudson has now filed a lawsuit against Tesla in state courts.

"Through a pervasive national marketing campaign and a purposefully manipulative sales pitch, Tesla has duped consumers" into believing that Autopilot can "transport passengers at highway speeds with minimal input and oversight," the lawsuit says.

Hudson had a two-hour commute to his job at an auto dealership. He says that he heard about Tesla's Autopilot technology last year and went to a Tesla dealership to learn more.

"Tesla's sales representative reassured Hudson that all he needed to do as the driver of the vehicle is to occasionally place his hand on the steering wheel and that the vehicle would 'do everything else,'" the lawsuit claims.


But that description of Tesla's Autopilot system is not true. While the system can handle a range of driving conditions, it's not designed to stop for parked cars or other stationary objects when traveling at highway speeds. This year, at least two other Tesla drivers have plowed into parked vehicles while their cars were in Autopilot mode (one of them sued Tesla last month). Another Tesla customer, Californian Walter Huang, was killed when his Tesla vehicle ran into a concrete lane divider at full speed.

"It is the driver's responsibility to remain attentive to their surroundings and in control of the vehicle at all times," a Tesla spokesman told Ars by email. "Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner's Manual and Release Notes for software updates." (I've reproduced Tesla's full emailed statement at the end of the story.)


Original Submission

  • (Score: 3, Interesting) by Immerman (3985) on Sunday November 04 2018, @06:23PM (#757688)

    I partially agree - it makes sense to give the autopilot access to all the data. However, the entire point of having a backup safety system (autobraking or whatever) is to prevent a failure by the driver (in this case autopilot) from killing you - to do that, it must NOT allow the autopilot to override. If the emergency braking system freaks out and thinks it needs to brake, the car brakes. If the autopilot thinks it's wrong, then it needs to navigate the car in a manner that avoids triggering the emergency braking system.

    Otherwise having the emergency braking system at all is pointless - it sees an oncoming collision and starts to brake, milliseconds later the imperfect autopilot overrules it, and you end up slamming into the parked car anyway.
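    The priority scheme described above can be sketched roughly as a command arbiter in which a triggered emergency-brake request always wins and can never be overruled by the autopilot. This is a hypothetical illustration of the commenter's argument, not Tesla's actual architecture; all names here are invented:

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Command:
        source: str
        brake: float     # 0.0 (no braking) to 1.0 (full braking)
        throttle: float  # 0.0 to 1.0

    def arbitrate(autopilot: Command, emergency: Optional[Command]) -> Command:
        """If the emergency braking system has triggered, its command wins
        unconditionally: the autopilot may not override it, only drive in a
        way that avoids triggering it in the first place."""
        if emergency is not None:
            # Take the stronger of the two brake requests and cut throttle.
            return Command("emergency",
                           max(emergency.brake, autopilot.brake),
                           0.0)
        return autopilot
    ```

    The key design choice is that `arbitrate` never lets the autopilot path reduce a triggered emergency-brake command, which is exactly the failure mode the comment warns about.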
