Tesla's 'Robotaxis' Keep Crashing, Even With Human 'Safety Monitors' Onboard:
Tesla's pilot "robotaxi" program is facing mounting scrutiny after multiple incidents in Austin, Texas, where the company's driverless cars have reportedly been involved in several low-speed crashes despite having human safety monitors on board. The incidents were spotted by Electrek and are documented on the NHTSA website; both sources cite federal reports confirming at least four accidents since the fleet quietly began operations this summer.
The NHTSA is already investigating Tesla's Full Self-Driving (FSD) software over erratic traffic behavior, and the robotaxi crashes appear to extend those concerns into Tesla's dedicated autonomous service. The agency said it is reviewing new reports related to these test vehicles as it evaluates whether Tesla's systems meet federal safety standards.
Each Tesla robotaxi currently operates with a safety monitor in the driver's seat, ready to take control if the system fails. But several of the Austin crashes occurred while the vehicles were moving slowly or were stationary; one incident involved contact with a fixed object in a parking area. Analysts say this suggests the system's perception and decision-making may not be giving monitors enough time to react, a key issue NHTSA has previously flagged in other FSD-related investigations.
[...] While Tesla's technology ambitions remain unmatched in scale, its safety record continues to trail several competitors in key metrics. A new industry report found that long-term battery reliability may be stronger elsewhere: Tesla ranks behind Kia in overall battery longevity for used EVs and plug-in hybrids, signaling that rivals are quietly catching up in core technical areas.
[...] For Tesla, the robotaxi initiative represents both its boldest gamble and its biggest regulatory risk. Despite years of promises about driverless capability, the company still faces federal oversight, unresolved safety probes, and a string of real-world mishaps that threaten public confidence. Each new incident underscores how complex full autonomy remains, even for a company that dominates global EV sales.
Until Tesla provides transparent data on crash frequency and performance, or demonstrates consistent reliability in live service, its robotaxi fleet will likely remain in testing limbo. For now, the only certainty is that the road to driverless mobility is proving bumpier than Tesla expected.
(Score: 4, Insightful) by Thexalon on Tuesday November 04, @12:45PM (6 children)
It might be, and hear me out, that instead of actually being a genius inventor, the person in charge of Tesla is a reasonably smart liar who makes all sorts of grand promises and pretends that those grand promises will come true any day now in order to keep the stock price as high as possible.
"Think of how stupid the average person is. Then realize half of 'em are stupider than that." - George Carlin
(Score: 1, Insightful) by Anonymous Coward on Tuesday November 04, @12:48PM (1 child)
Hmmm, sounds a lot like Sam Altman as well.
(Score: 3, Insightful) by Thexalon on Tuesday November 04, @12:54PM
As a general rule, I've found that the grander the promises made by the company's boss, the less those promises match achievable reality. That was true for Elizabeth Holmes, and for Steve Jobs even when he was selling stuff that was behind rather than ahead of the times.
"Think of how stupid the average person is. Then realize half of 'em are stupider than that." - George Carlin
(Score: 4, Funny) by YeaWhatevs on Tuesday November 04, @01:13PM (1 child)
No. forking. way.
So when will I get that monorail?
(Score: 2) by aafcac on Tuesday November 04, @08:23PM
Right after I get my jetpack.
(Score: 2, Interesting) by Type44Q on Thursday November 06, @12:28AM (1 child)
Elon has shown us at least three things:
1) He delivers revolutionary solutions to transportation problems (e.g. fucking reusable spacecraft and electric cars that don't [have to] suck).
2) His companies are as innovative at manufacturing as was Henry Ford's.
3) He's not a car guy and only grasps the "driving problem" quantitatively (his qualitative grasp is clearly somewhat tenuous - his steering yoke and AI notions prove this).
(Score: 2) by DadaDoofy on Saturday November 08, @06:42PM
The shareholders agree with you. Elon Musk will now be the world's first trillionaire.
https://www.nytimes.com/2025/11/06/business/elon-musk-tesla-pay-vote.html [nytimes.com]
(Score: 0) by Anonymous Coward on Tuesday November 04, @01:11PM
All over downtown Austin, doing endless circles.
(Score: 2, Informative) by Anonymous Coward on Tuesday November 04, @01:32PM (10 children)
> Tesla's 'Robotaxis' Keep Crashing, Even With Human 'Safety Monitors' Onboard
People are not good safety monitors when the job is mostly boring. It takes a rare kind of person to continue to maintain situational awareness without actively participating. Here's a paper with some history: https://technologyandsociety.org/its-time-to-rethink-levels-of-automation-for-self-driving-vehicles/ [technologyandsociety.org]
A mechanical test lab that I visit a few times a year has this problem--the tests can last an hour or more, all pre-programmed and automatically controlled. But expensive things can go wrong, and the operator has a kill button to stop the test and minimize damage. There are a few good operators; they somehow manage to keep eyes and ears on the test and have a very good (but not perfect) record of stopping tests at the appropriate moment. It takes years to train and qualify these operators, and many don't make the cut.
(Score: 5, Funny) by DannyB on Tuesday November 04, @02:39PM (3 children)
Or . . . Tesla's Full Self Driving AI is able to outsmart the safety monitors.
It's a game. Can I identify an opportunity to crash before the safety monitor can override?
If your boy is chewing on electrical cords, then ground him until he conducts himself properly.
(Score: 0) by Anonymous Coward on Tuesday November 04, @07:18PM (2 children)
Funny mod well deserved.
More seriously, this...
> Can I identify an opportunity to crash before the safety monitor can override?
...is trivial on any road without a center divider--just cross the centerline into opposing traffic and you will crash in almost no time. Certainly no time for a person to react unless they already have their hands on the steering wheel and feel it start to move.
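Some back-of-envelope numbers make the point (generic estimates, not Tesla data):

    import math

    # How long does a hands-off monitor have if the car swerves across
    # the centerline? Assume a moderate-to-firm swerve on a standard lane.
    lateral_accel = 4.0   # m/s^2, lateral acceleration of the swerve
    lane_width = 3.7      # m, typical US lane width

    # Constant lateral acceleration: d = 0.5 * a * t^2  =>  t = sqrt(2d/a)
    t_cross = math.sqrt(2 * lane_width / lateral_accel)
    print(f"time to reach opposing traffic: {t_cross:.2f} s")  # ~1.36 s

    # Alert-driver reaction time is commonly quoted at 1.0-1.5 s, and a
    # passive monitor reacts slower than an alert driver, so the window
    # is effectively gone before a hand even reaches the wheel.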
This scenario is one that the self-driving car promoters like to sweep under the rug.
(Score: 2) by PiMuNu on Wednesday November 05, @07:30AM (1 child)
It's a problem that can be fixed with active roadways. It is a difficult computational problem (and sometimes a human cognition problem) to figure out where the centre dividing line is (in rain, fog, at night, with peeling paint, etc.). It is a trivial problem to locate a vehicle by ping and notify it where the centre dividing line is.
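To make that concrete, here is a minimal sketch of the vehicle side, assuming the roadway broadcasts its centre-line geometry as a polyline. The message format and every name here are invented for illustration, not any real V2X standard:

    import math

    # Hypothetical broadcast from an "active roadway": the centre dividing
    # line as a polyline of (x, y) points, in metres, in road coordinates.
    CENTRE_LINE = [(0.0, 0.0), (50.0, 0.5), (100.0, 0.0)]

    def signed_offset(px, py, line):
        """Signed lateral distance from the centre line.
        Positive = right of the line (our side, in right-hand traffic)."""
        best = None
        for (x1, y1), (x2, y2) in zip(line, line[1:]):
            dx, dy = x2 - x1, y2 - y1
            t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
            cx, cy = x1 + t * dx, y1 + t * dy        # closest point on segment
            dist = math.hypot(px - cx, py - cy)
            cross = dx * (py - y1) - dy * (px - x1)  # sign tells us the side
            d = dist if cross < 0 else -dist
            if best is None or abs(d) < abs(best):
                best = d
        return best

    # A vehicle at (40.0, 1.8) is about 1.4 m left of the centre line,
    # i.e. drifting into opposing traffic, so it gets warned.
    if signed_offset(40.0, 1.8, CENTRE_LINE) < 0:
        print("over the centre line - steer back")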
(Score: 1, Informative) by Anonymous Coward on Wednesday November 05, @03:27PM
Till someone gets it wrong or intentionally makes it wrong. Or the line data is correct but the other vehicles are not where they should be (avoiding something else, etc.). Then the robocar crashes? Where the line is isn't as important as driving in a way that reduces the chances of collisions, whether for your own vehicle or for other vehicles.
Most human drivers can cope with misdrawn lines or even no lines. Some not very well, but you just can't rely 100% on the lines, especially when your life is on the line... I would prefer that the robocars be more than merely better than the crappier human drivers.
(Score: 2) by Username on Tuesday November 04, @02:50PM (2 children)
>several of the Austin crashes occurred while the vehicles were moving slowly or stationary
How can they blame Tesla for a crash when their car was stationary?
(Score: 1, Informative) by Anonymous Coward on Tuesday November 04, @05:13PM (1 child)
Q. How can they blame Tesla for a crash when their car was stationary?
A. When the Tesla phantom-braked, stopping in traffic for no reason, and got rear-ended.
According to a recent talk I listened to, this is extremely common and (depending on jurisdiction) often not even reported.
(Score: 1, Insightful) by Anonymous Coward on Wednesday November 05, @11:56PM
I drive an old car. Big, heavy, paid-for, and incredibly cheap to own by today's standards. I have learned to give Teslas a very generous spacing in front of me, as their braking is far better than mine. I have had several demonstrations. I don't want to tempt fate and a lot of legal work over a few seconds of gap time.
Yes, I get impatient zoom-zooms (such as BMWs) doing whatever it takes to pass me and close that gap I had made. I won't contest them at all; I will even slow down during their pass. I'd much rather have the impatient zoom-zooms ahead of me than behind me anyway ... I find it a huge distraction to constantly prepare for erratic zoom drivers doing stupid things that might involve me. Let the insurance companies deal with the cost of underwriting them.
I figure it's far better that they tangle with that Tesla than me.
(Score: 2) by aafcac on Tuesday November 04, @08:27PM (2 children)
I've said it before and I'll say it again: it really should have started out with something along the lines of the human driving for 10 minutes and the car driving itself for 5. All the data on what was encountered should have been saved and used as the basis for training the AI. Five minutes of self-driving is still short enough that most people capable of driving should have little issue monitoring it. Over time, as the car's self-driving got better, the portion of the time that the car drives versus the human could increase.
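Mechanically, that duty cycle is just a scheduler plus a logger. A rough sketch of the idea, with every name invented for illustration and the phases compressed to fractions of a second so the sketch actually runs:

    import time
    from itertools import cycle

    class Controller:
        """Stand-in for the handover mechanism between human and autonomy."""
        def set_mode(self, mode):
            print(f"control handed to: {mode}")

    def drive_session(controller, phases=2):
        # 10 minutes human / 5 minutes auto in real life; here 1.0 s / 0.5 s.
        schedule = cycle([("human", 1.0), ("auto", 0.5)])
        log = []
        for _ in range(phases):
            mode, seconds = next(schedule)
            controller.set_mode(mode)
            end = time.monotonic() + seconds
            while time.monotonic() < end:
                # Label every sample with who was driving: human phases become
                # supervised training examples, auto phases can be audited.
                log.append({"mode": mode, "t": time.monotonic()})
                time.sleep(0.1)
        return log

    samples = drive_session(Controller())
    print(len(samples), "labelled samples collected")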
There's also the approach taken by other manufacturers of just working to perfect things like lane centering, emergency auto-braking, and adaptive cruise control, where the human is still partially driving the entire time. One of my takeaways from driving a rented Toyota is that a really good adaptive cruise control paired with emergency auto-braking takes a tremendous amount of cognitive load off the driver, even in traffic; that freed-up attention can go toward watching for things the car can't easily predict.
(Score: 2) by ElizabethGreene on Wednesday November 05, @06:02PM (1 child)
They used millions of miles of FSD driving data to train the AI. 10-on 5-off wouldn't even be a drop in the bucket.
(Score: 2) by aafcac on Wednesday November 05, @10:17PM
They used millions of miles of easy conditions without any real oversight as the basis for the models. 10 minutes on and 5 off means that there's actual real oversight and a collection of data that could actually be useful. As opposed to just dumping a "driver" into a dangerous situation with little ability to react.
Sure, it would take rather a long time with a duty cycle like that, but the quality of the data would be a lot better as you could use both the periods with a human driver and the periods with the car driving itself as the basis for training models. But, given the cases where the "AI" has run over motorcyclists, run into overturned concrete trucks and emergency vehicles on the side of the road, it's pretty clear that they used millions of miles of easy situations as the basis for the training.
The whole assumption that there's any intelligent thought going into Tesla's fraudulently named Autopilot or Full Self-Driving is simply not well supported by its track record of killing people.
(Score: 4, Insightful) by Snotnose on Tuesday November 04, @02:33PM (1 child)
Just imagine you've been sitting for an hour, hands off the wheel, feet off the pedals. Where is your brain? I'll bet it's not on that kid in the crosswalk, nor the fire truck blocking the road. No, it's probably wondering how your passengers can stand this type of music, wondering how long til lunch, and how you're gonna buy groceries without SNAP.
Parents in Africa: "Finish your food kids, there are starving children in America"
(Score: 5, Insightful) by Thexalon on Tuesday November 04, @07:13PM
The problem the safety monitor is solving is: When a Tesla crashes, Tesla can fire the safety monitor and blame them rather than blame their own technology for being less than what is advertised.
"Think of how stupid the average person is. Then realize half of 'em are stupider than that." - George Carlin
(Score: 5, Funny) by looorg on Tuesday November 04, @02:48PM (6 children)
Is this when it's driving in "Mad Max" mode (apparently that is a thing)? Perhaps it should try going for a "Driving Miss Daisy" mode instead. It sounds a lot safer.
(Score: 2) by bzipitidoo on Tuesday November 04, @06:40PM (5 children)
Driving Miss Daisy might be the only movie or TV show driving style I'd want programmed into a real world vehicle!
Have you seen the crazy stuff California drivers do? Motorcycles are allowed to run between the lanes. Drivers won't slow down enough for weather, won't wait for enough visibility before attempting to pass, and like to follow too close. They can have a fatalistic attitude about it all. After seeing that, I understand how 100-car pileups can happen on I-5. It's like Hollywood infected their brains with its dramatically risky driving styles.
(Score: 2) by looorg on Tuesday November 04, @08:01PM (4 children)
I was in general more surprised that they actually do have a driving mode they call "Mad Max"; one wonders if that would be above or below "Carmageddon" -- which sounds more like what they are actually going for.
As much as I like the idea of motorcycles, in theory, it always felt a bit unsafe. That said, if I were driving a motorcycle and it was raining, I would like to get off the road as soon as possible too. The question is whether you speed up and keep going, or pull over, take a break, and hope the rain stops soon. But it is probably a surprise to nobody that riders are only half-jokingly referred to as 'organ donors'.
(Score: 2) by bzipitidoo on Wednesday November 05, @12:10AM (2 children)
They seriously have a "Mad Max" mode? That's not a joke?
One time I was on highway 99 in the central valley when it started to rain. In the space of one minute, I saw a car ahead of me change lanes too aggressively, lose control and spin out and end up in the median facing backwards, another car skid off the road, and emergency vehicles come onto the road racing to yet another accident. I got the H off the road and had a bite to eat while I waited for things to settle down.
I admire the fuel economy motorcycles can have, but yes, they are far too dangerous. Yeah, I've heard bikers called "organ donors". And yes, I believe it is standard practice, or at least highly recommended, that you don't even try to ride in the rain; you stop and wait until the rain ends.
(Score: 2) by looorg on Wednesday November 05, @12:19AM
Apparently they do. It sounded so stupid and bad, yet there it apparently is. There are multiple links about it if you just search for Tesla and Mad Max. This one has images of the UI ... I gather it makes more liberal interpretations of the traffic code.
One hopes there is no Thunderdome incident ...
https://www.foxnews.com/tech/tesla-revives-mad-max-mode-full-self-driving [foxnews.com]
(Score: 4, Informative) by ElizabethGreene on Wednesday November 05, @05:57PM
In the new FSD update the modes are Sloth, Chill, Standard, Hurry, and Mad Max. The Sloth and Mad Max additions are new in this update, and I haven't used them. 98% of my driving is FSD and 80% of that is on Chill. I flip to Standard if I don't want it to aggressively return to the rightmost lane. I've been using these modes for a couple of weeks and they've been fantastic.
(Score: 3, Interesting) by Kilo110 on Thursday November 06, @01:44AM
I ride motorcycles. I can answer that.
The first 15-20 min of rain is the most dangerous: all the oils on the road get lifted up, which makes the surface slick. After that initial gunk gets washed away, the rain itself isn't a big deal. Modern tires are really good at shedding water and keeping grip, assuming they're in good shape, properly inflated, etc.
The real big issue is getting wet while riding. That can easily lead to hypothermia and a general loss of sensation in the extremities. Not a great state to be in when one needs both hands and both feet to control the vehicle.
I keep an emergency rain poncho in my bag although I've yet to use it.
So if I was caught in a sudden storm, I'd find some cover and pull out my rain poncho. I'd then wait about 15-20 min for the oils to wash off the road. Then I'd carefully make my way back while avoiding excess lean on curves.
(Score: 4, Informative) by SomeGuy on Tuesday November 04, @06:59PM (2 children)
Ah, here we are again. We have been down this road a thousand times already, the self driving "AI" keeps circling back down the same street. I know we have been this way before, because there is that marketing clown creaming in his pants again.
But it promises, it PROMISES it will get you there eventually. It *PROMISES*, I tell you! And all the dumbfucks believe it.
Get these things off the road. Give it up already. Dig a hole and bury them. Burn them with fire. Heh, try those around here and it will be Robotaxi hunting season.
(Score: 2) by DannyB on Wednesday November 05, @04:54PM (1 child)
Maybe full self driving will always be ten years away. Like nuclear fusion. Or male contraceptives.
If your boy is chewing on electrical cords, then ground him until he conducts himself properly.
(Score: 2) by aafcac on Wednesday November 05, @10:23PM
I doubt it. Self-driving would be here already if we were willing and able to rebuild all the roads with self-driving cars in mind. Line-following robots were already a thing in the '80s (the whole control loop fits in a few lines; see the sketch below); all new cars will be required to have automatic emergency braking by 2029; my wife's 10-year-old Ford can parallel park almost entirely on its own. Between those things, the ability to make a car that drives itself safely from place to place is something that could already be done. It's just super expensive and would require that all the human-driven cars be removed or segregated from the self-driving ones.
So we definitely will get there eventually, at least for 99% of the driving; it's more a question of how long it takes and how much of the infrastructure needs to be rebuilt to make it happen.
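For the curious, an '80s-style line follower really does boil down to a few lines. A toy sketch, with the sensor and motor hooks invented for illustration:

    # Two reflectance sensors straddle the line; steer proportionally toward
    # whichever side sees more line. read_left/read_right return 0.0 (floor)
    # to 1.0 (line); set_motors takes left/right wheel speeds.
    BASE_SPEED = 0.5   # fraction of full motor speed
    KP = 0.8           # proportional steering gain

    def follow_line_step(read_left, read_right, set_motors):
        error = read_left() - read_right()   # > 0 means the line is to our left
        # Slow the inside wheel and speed up the outside one to turn toward it.
        set_motors(max(0.0, BASE_SPEED - KP * error),
                   max(0.0, BASE_SPEED + KP * error))

    # One simulated tick: line drifting left -> robot steers left to chase it.
    follow_line_step(lambda: 0.9, lambda: 0.2,
                     lambda l, r: print(f"motors L={l:.2f} R={r:.2f}"))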