According to Car and Driver: 830,000 Teslas with Autopilot under NHTSA Investigation, Recall Possible
The National Highway Traffic Safety Administration (NHTSA) will take a deeper look into how Tesla vehicles equipped with so-called Autopilot driver assistance software navigate when interacting with first responder vehicles at the scene of a collision. NHTSA said this week that it is upgrading the Preliminary Evaluation it started last August into an Engineering Analysis, which is the next step in a possible recall of hundreds of thousands of Tesla vehicles.
NHTSA said in its notice that it was motivated to upgrade the status of the investigation because of "an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck stationary in-road or roadside first responder vehicles tending to pre-existing collision scenes."
[...]
In a public update on its probe, NHTSA laid out its case for why Autopilot needs to be investigated. NHTSA said it has so far investigated 16 crashes and found that Autopilot only aborted its own vehicle control, on average, "less than one second prior to the first impact" even though video of these events proved that the driver should have been made aware of a potential incident an average of eight seconds before impact. NHTSA found most of the drivers had their hands on the wheel (as Autopilot requires) but that the vehicles did not alert drivers to take evasive action in time.
Related Stories
Tesla has yet another federal headache to contend with. On March 4, the National Highway Traffic Safety Administration's Office of Defects Investigation opened a preliminary investigation after two reports of Tesla Model Y steering wheels detaching in drivers' hands while driving.
NHTSA's ODI says that in both cases, the model year 2023 Model Ys each required repairs on the production line that involved removing their steering wheels. The wheels were refitted but were only held in place by friction—Tesla workers never replaced the retaining bolt that affixes the steering wheel to the steering column. In 2018, Ford had to recall more than 1.3 million vehicles after an incorrectly sized bolt resulted in a similar problem.
The ODI document states that "sudden separation occurred when the force exerted on the steering wheel overcame the resistance of the friction fit while the vehicles were in motion" and that both incidents occurred while the electric vehicles still had low mileage.
Related:
Tesla recalls all cars with FSD (Full Self-Driving) option (Elon tweet: "Definitely. The word 'recall' for an over-the-air software update is anachronistic and just flat wrong!")
Feds Open Criminal Investigation Into Tesla Autopilot Claims
NHTSA Investigation Into Tesla Autopilot Intensifies
Tesla's Radar-less Cars Investigated by NHTSA After Complaints Spike
Tesla Under Federal Investigation Over Video Games That Drivers Can Play
Tesla Must Tell NHTSA How Autopilot Sees Emergency Vehicles
NHTSA Opens Investigation into Tesla Autopilot after Crashes with Parked Emergency Vehicles
Tesla Recall is Due to Failing Flash Memory
Tesla Crash Likely Caused by Video Game Distraction
Autopilot Was Engaged In The Crash Of A Tesla Model S Into A Firetruck In LA, NTSB Says
Tesla to Update Battery Software after Recent Car Fires
Tesla Facing Criminal Probe
Former Tesla Employee's Lawyer Claims His Client Was Effectively "SWATted"
NHTSA Finishes Investigation, Declares Tesla Has No Fault in Deadly Crash
Tesla Says Autopilot System Not to Blame for Dutch Crash
(Score: 3, Funny) by Barenflimski on Monday June 13 2022, @05:06AM (1 child)
Maybe all Teslas need flashing lights and sirens to let the first responders know to get out of the way?
"Hey Cap, get out of the way! Here comes another musk-mobile!"
I feel there is an xkcd here...
(Score: -1, Offtopic) by Anonymous Coward on Monday June 13 2022, @05:26AM
Maybe Elon needs to stop messing with the regime's social control mechanism. Leave Twitter alone!
(Score: 4, Interesting) by sgleysti on Monday June 13 2022, @05:29AM (2 children)
Suppose Tesla does need to recall these vehicles... I'd imagine they'll have to figure out how to make this so-called "autopilot" recognize first responder vehicles. But what if they can't do that? Would they have to turn the autopilot off?
This reinforces the point that there's this tough intermediate stage between full manual control and full self-driving where systems like "autopilot" work well most of the time but require humans to be ready to intervene at a moment's notice. That does not work well at all with human psychology, as we're highly prone to trust the driving assist software once we realize it mostly works. It then becomes so much harder to maintain the vigilance required to avoid crashes in the somewhat uncommon situations that the driving assist software cannot handle.
(Score: 2, Insightful) by Anonymous Coward on Monday June 13 2022, @01:01PM
"but require humans to be ready to intervene at a moment's notice."
If the autopilot notifies you there is a problem a second before the crash at highway speed, the required reaction involves time travel.
That is way past incompatible with human psychology.
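A rough back-of-the-envelope sketch makes the point concrete. All of the numbers below are assumptions for illustration (about 70 mph of highway speed, roughly 1.5 s of driver reaction time, about 0.7 g of braking on dry pavement), not figures from the article:

```python
# Back-of-the-envelope only; every number here is an assumption, not from the article.

MPH_TO_MPS = 0.44704
G = 9.81  # m/s^2

speed = 70 * MPH_TO_MPS             # assumed highway speed, ~31 m/s
reaction_time = 1.5                 # assumed seconds before the driver even starts braking
decel = 0.7 * G                     # assumed braking deceleration, ~6.9 m/s^2

warning_window = 1.0 * speed                  # distance covered during a 1 s warning
reaction_distance = reaction_time * speed     # distance covered before the brakes bite
braking_distance = speed ** 2 / (2 * decel)   # v^2 / (2a) to a full stop
total_stop = reaction_distance + braking_distance

print(f"distance covered in a 1 s warning window: {warning_window:.0f} m")
print(f"distance covered during driver reaction:  {reaction_distance:.0f} m")
print(f"braking distance to a full stop:          {braking_distance:.0f} m")
print(f"total stopping distance:                  {total_stop:.0f} m")
```

With those assumptions the car covers about 30 m during the one-second warning and needs on the order of 120 m in total to come to a stop, so a warning that arrives one second before impact really does leave no usable margin.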
(Score: 0) by Anonymous Coward on Monday June 13 2022, @04:25PM
Yes. For some time now I've been saying that SAE Level 3 autonomy should be forbidden, unless it can always give sufficient time before the handoff from automatic to human (time tbd--possibly in the 10 seconds to 1 minute range?)
Tesla is already having handoff problems with their Level 2, in part (I believe) because they are making claims that their Level 2 "Autopilot" is actually Level 3.
SAE levels-- https://blog.ansi.org/sae-levels-driving-automation-j-3016-2021/ [ansi.org]
(Score: 0) by Anonymous Coward on Monday June 13 2022, @05:34AM
NHTSA should allow the autopilot to continue if all the affected cars can return to base on their own.
(Score: 5, Insightful) by Booga1 on Monday June 13 2022, @05:50AM (4 children)
I notice that they mention that Autopilot turns itself off right before impact. The cynic in me says they're doing that so they can say "Autopilot was not in use at the time. Control had been passed back to the driver, so there's nothing more we could have done."
My measure of acceptance for self-driving cars is when all accident liability is assigned to the car manufacturer and related software. Barring gross negligence in vehicle maintenance, why should you be on the hook if you're not the one in control?
(Score: 3, Interesting) by inertnet on Monday June 13 2022, @08:35AM
Marketing shortens that to "Autopilot", while the emphasis should lie on "driver assistance software". The second part of that quote negates the first part. Drivers who let their car do its thing without paying attention were misled by marketing. Although those drivers get blamed when accidents happen, the deceptive marketing should be held at least partially responsible.
(Score: 0) by Anonymous Coward on Monday June 13 2022, @12:16PM
'The cynic in me says they're doing that so they can say "Autopilot was not in use at the time. Control had been passed back to the driver, so there's nothing more we could have done."'
I think you are being too kind. It is a much worse situation if a one-second warning is the best they can do.
Piloting a vehicle (Driving?) involves looking ahead and forming a mental image of what might be going to happen on the road ahead. The further ahead you look, the more reaction time you have. Highway speeds require much more than one second to stop for something stopped in front of you. If Tesla is doing the best they can, it says that in these instances, their internal model of the world is not able to predict what is going to happen far enough ahead to ensure a safe outcome. That would say they don't know there is a problem until it is too late to do anything about it.
Tesla records a lot of information as you are driving. I'd bet that if they wanted to, they could recreate their internal model just prior to these crashes in video form to show a jury what they knew and why they didn't alert until it was essentially too late. Tesla makes really good stuff, but it is not perfect. Such a video would be informative but somewhat unfair, because it cherry-picks cases with a known bad outcome. How many hours of good outcomes, where the autopilot is better than a bored driver, make up for the 3 seconds where it is not?
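To put the NHTSA figures quoted in the summary (hazard reportedly visible about eight seconds before impact, Autopilot aborting control less than one second before) into look-ahead distance, here is a minimal sketch that assumes a 65 mph highway speed:

```python
# The 8 s / <1 s figures come from the NHTSA findings quoted in the summary;
# the 65 mph highway speed is an assumption for illustration.

MPH_TO_MPS = 0.44704
speed = 65 * MPH_TO_MPS  # ~29 m/s

visible_horizon = 8 * speed  # how far ahead the hazard was reportedly visible
abort_horizon = 1 * speed    # how close the car was when Autopilot gave up control

print(f"hazard visible roughly {visible_horizon:.0f} m ahead")       # ~232 m
print(f"Autopilot disengaged within roughly {abort_horizon:.0f} m")  # ~29 m
```

Under that assumption the hazard was in view for a couple of hundred metres of travel, yet the system only gave up control within the last 30 m or so, which is roughly the timeline any such reconstruction would have to explain.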
(Score: 5, Interesting) by TheGratefulNet on Monday June 13 2022, @01:44PM (1 child)
only tesla who can see the source code can say for sure. but its unlikely that the ap switches off 'just before an accident'.
if that were true and done for defensive reasons (for tesla) it would not be hard to get the code in court discovery and fuck them HARD if they played that game on us. I'd personally be happy to be an expert witness in such a case (as I work in the industry and know it well enough to pull this off. and how I'd LOVE to see elon squirm. I'd actually pay for a chance to see him in a defensive position. but anyway...)
AP is a child driver and has to be treated as such. with AP you have to supervise it. with fsd beta you have to concentrate even harder than your most intense concentration in manual driving. it will kill you and mangle your car if you let it. tesla will NOT pay for anything and will blame you, 100%.
I never 'bought' the fsd option. it started out around $7k and now its, what, $10k or $15k? you can 'rent' it too, per month, but either way its a bad idea. it costs you more effort, it stresses you out and makes you an unpaid beta tester for the company. who in their right mind would do that? I WORK in this field and I'd never join a looney program like that. and, if you fark up, you get kicked out even though you paid a lot to join. elon is laughing himself sick about all the fanboys paying to join a beta^Halpha program like that.
I sure learned a lot by owning a tesla the last 2+ years. and it will be the last one I ever own. glad I learned what its like to be fucked by tesla. its not good, guys. car is fun but the company is complete shit.
"It is now safe to switch off your computer."
(Score: 0) by Anonymous Coward on Tuesday June 14 2022, @08:18AM
well, before you start fucking anyone hard, realize that 1s is not enough time for anything. The main point is that drivers must still be paying attention to the road so they are still liable for the collision. But Tesla could be forced to do things that they don't like to do.
(Score: 0) by Anonymous Coward on Monday June 13 2022, @06:46AM (8 children)
Pretty sure they push updates fairly frequently; this may be an issue patched long ago.
(Score: 3, Informative) by Nuke on Monday June 13 2022, @08:45AM (1 child)
Doesn't sound like it; the source article is dated only a couple of days ago. Even if Tesla did claim they fixed it, NHTSA is right not to trust them. Would you trust anything that comes out of Elon Musk's mouth, or out of his PR droids?
(Score: -1, Troll) by Anonymous Coward on Monday June 13 2022, @11:34AM
OK, so now that I'm not on my phone, I followed the link to the article, and although I didn't read it through in detail, a quick scan seems to indicate that the article has no idea what software even gets updated.
The very first subtitle starts off with "Tesla EVs dating back to 2014".
Then talking about "recalls" due to software that can and does get updated is...
IDK, what is this, a series of tubes?
That being said, "Autopilot" is likely different from the FSD tests, which the article didn't seem to differentiate between. I might be wrong about the frequency of Autopilot updates if that's the case; however, I'd be surprised if this really was a problem that couldn't be patched remotely.
TBH it just sounds like another hit piece.
(Score: 4, Insightful) by Booga1 on Monday June 13 2022, @10:36AM (1 child)
They couldn't get it right before, so why would anyone think they've got it right now?
(Score: 0) by Anonymous Coward on Tuesday June 14 2022, @04:25AM
Like how they didn't land the falcon in the first attempts?
(Score: 5, Interesting) by TheGratefulNet on Monday June 13 2022, @01:39PM (3 children)
updates are not pushed, exactly. you have to allow them as an owner. (model 3 owner, here)
I've denied updates for over 2 years. yes, I know what I'm doing, thank you.
point is: its not forced. they nag you each time you get in the car but so far, in those 2 years, I've NEVER had an update truly forced. nags, yes, they are there every day. I close the nag box and get on with my drive. the next buyer for this car will likely be an enthusiast who dreamed of having this older version, and there's 0.0% chance of getting it the authorized way since there is no rollback or undo feature. once you take fw, it's there and can't be undone. that's why I resisted the updates. they all were regressions. ALL of them. I read the release notes. I know what I'm doing ;)
anyway, there is no need to return the car, you simply accept an update.
NOTE: you cant take patches. they dont offer that. its a full os+apps image or nothing. that's another reason I refuse it. (and my radar still works. no one else's does if they take the later updates, even if their car has the radar hw on it!)
"It is now safe to switch off your computer."
(Score: 0) by Anonymous Coward on Monday June 13 2022, @04:40PM (2 children)
> they all were regressions
Just curious, if your car goes into a Tesla shop for service, do they automatically update all the software too?
That would wipe out the older software version that you are so carefully preserving.
(Score: 4, Interesting) by TheGratefulNet on Monday June 13 2022, @05:35PM (1 child)
if your car goes into a Tesla shop for service, do they automatically update all the software too
I think that they are 99% likely to upgrade me no matter what I tell them or ask them or have it noted on my svc ticket.
its a real reason I avoid going there. there's a small thing I wanted done but I dare not, given that they'll force some broken os update on me.
guys, realize that tesla is not operating in the interest of the user. they wont listen to what I want. this is one reason I'm going to find another car and this will be my first/last/only tesla.
when I finally need work done, I'll find a shop that is not tesla and have them do it. for tires and things like that, no need for the vendor.
but your point is WELL taken and I'm well aware of this. its another reason my car is a ticking time bomb. if my car needs service and it cant wait, I'm done for. bye bye radar and hello 'vision' even though my radar is still there and today, still quite good at what it does. updates afterwards all disable radar and go 'vision only' which I REFUSE to do.
"It is now safe to switch off your computer."
(Score: 0) by Anonymous Coward on Monday June 13 2022, @06:01PM
As I suspected, thanks for the answer and good luck!