Submitted via IRC for Bytram
Earlier this month, Shawn Hudson's Tesla Model S crashed into a stalled car while moving at about 80 miles per hour on a Florida freeway. Tesla's Autopilot technology was engaged at the time, and Hudson has now filed a lawsuit against Tesla [clickorlando.com] in state court.
"Through a pervasive national marketing campaign and a purposefully manipulative sales pitch, Tesla has duped consumers" into believing that Autopilot can "transport passengers at highway speeds with minimal input and oversight," the lawsuit says [arstechnica.net].
Hudson had a two-hour commute to his job at an auto dealership. He says that he heard about Tesla's Autopilot technology last year and went to a Tesla dealership to learn more.
"Tesla's sales representative reassured Hudson that all he needed to do as the driver of the vehicle is to occasionally place his hand on the steering wheel and that the vehicle would 'do everything else,'" the lawsuit claims.
But that description of Tesla's Autopilot system is not true. While the system can handle a range of driving conditions, it's not designed to stop for parked cars or other stationary objects when traveling at highway speeds. This year, at least two other Tesla drivers have plowed into parked vehicles [arstechnica.com] while their cars were in Autopilot mode (one of them sued Tesla [nbcnews.com] last month). Another Tesla customer, Californian Walter Huang, was killed when his Tesla vehicle ran into a concrete lane divider [arstechnica.com] at full speed.
“It is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times," a Tesla spokesman told Ars by email. "Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner’s Manual and Release Notes for software updates.” (I've reproduced Tesla's full emailed statement at the end of the story.)
Hudson's crash occurred on Friday, October 12. He was traveling south at about 80 mph on the Florida Turnpike. Hudson says he was "relaxing during his commute" when "suddenly, and without any warning" the car crashed into a Ford Fiesta that had been left in the travel lane.
"If this had been something more substantial than a Ford Fiesta he wouldn't be here," Hudson's lawyer, Mike Morgan, said at a press conference announcing the lawsuit.
"Hudson became the guinea pig for Tesla to experiment their fully autonomous vehicle," Morgan charged.
According to Morgan, the Model S manual states that "you can engage it over 50 miles an hour, but if you engage it over 50 miles an hour, it's got trouble finding stationary objects and stopped cars. To me, that's a big problem. To me, that means you're selling nothing."
To be fair to Tesla, this problem isn't unique to the company. Most emergency braking systems on the market today won't stop for stationary objects at freeway speeds. These systems are not sophisticated enough to distinguish a stationary object on the road from one that's next to or above the road. So to make the problem easier to handle, the cars may just ignore stationary objects, assuming that the driver will steer around them.
Most of the time, this works well enough. The travel lane on the freeway is only supposed to have moving cars in it. If a driver is using adaptive cruise control to maintain speed with a car ahead, it works great. But it can fail catastrophically in rare circumstances when there's a stationary object directly in the road—or when the car's lane-keeping system gets confused about where the lane is and steers the car into a stationary object next to the road. The latter scenario is what apparently happened in the death of Walter Huang—the car got confused about the location of the lanes, steered the vehicle into a "lane" that wasn't actually a lane, and wound up running directly into a lane divider.
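The filtering logic described above can be sketched in a few lines. This is a hypothetical, heavily simplified illustration of the general technique, not Tesla's (or any manufacturer's) actual code; the class, function names, and the 1 m/s threshold are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class RadarReturn:
    distance_m: float          # range to the detected object
    relative_speed_mps: float  # closing speed (negative = approaching)

def select_targets(returns, ego_speed_mps):
    """Keep only radar returns that look like moving vehicles.

    A return from a stopped car can be indistinguishable from one off an
    overhead sign or a bridge abutment, so at highway speed the filter
    drops anything whose absolute speed is near zero and trusts the
    driver to steer around it.
    """
    targets = []
    for r in returns:
        # Object's speed over the ground = our speed + its speed relative to us.
        absolute_speed = ego_speed_mps + r.relative_speed_mps
        if abs(absolute_speed) > 1.0:   # near-stationary returns are discarded
            targets.append(r)
    return targets

# Ego car at 80 mph (~35.8 m/s): a stalled car ahead closes at -35.8 m/s,
# while a slower-moving car ahead closes at only -5 m/s.
ego = 35.8
returns = [
    RadarReturn(distance_m=60.0, relative_speed_mps=-35.8),  # stalled car
    RadarReturn(distance_m=90.0, relative_speed_mps=-5.0),   # moving car
]
tracked = select_targets(returns, ego)
print(len(tracked))  # the stalled car is filtered out; only 1 target remains
```

The sketch shows why a stalled car in the travel lane is the worst case: its absolute speed is exactly zero, so it falls into the same bucket as roadside clutter and is ignored, while the moving car ahead is tracked normally for adaptive cruise control.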
But while a number of car companies have driver-assistance technology with this kind of limitation, Tesla touts the capabilities of its system more aggressively than many of its rivals. Tesla's Autopilot page [tesla.com] has a big banner across the top that says "full self-driving hardware on all cars." It also features a video of a vehicle navigating through an urban environment with the driver's hands on his lap the entire time.
Savvy observers know that the video is two years old and depicts a research prototype, not the capabilities of shipping Tesla vehicles. But there's no disclaimer in the video or the surrounding text. Hudson says that Tesla's marketing materials led him to believe that Tesla's vehicles can drive themselves on the freeway with minimal human intervention.
In the past, Tesla has insisted that despite recent crashes, drivers are significantly safer with Autopilot engaged than without it. But when we looked into these claims in May, we found that they didn't hold up to scrutiny [arstechnica.com]. We simply don't have good enough data to tell whether the use of Autopilot saves lives on net. And we do know that in some cases, Autopilot has made mistakes that a human driver would be unlikely to make.
Update: I added a comment from Tesla to this story. Here's the company's full emailed statement:
We don’t like hearing about any accidents in our cars, and we are hopeful that those involved in this incident are recovering. In this case, the car was incapable of transmitting log data to our servers, which has prevented us from reviewing the vehicle’s data from the accident. However, we have no reason to believe that Autopilot malfunctioned or operated other than as designed. When using Autopilot, it is the driver’s responsibility to remain attentive to their surroundings and in control of the vehicle at all times. Tesla has always been clear that Autopilot doesn’t make the car impervious to all accidents, and Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner’s Manual and Release Notes for software updates.