posted by janrinok on Thursday February 26, @09:17PM

Tesla 'Robotaxi' adds 5 more crashes in Austin in a month:

Tesla has reported five new crashes involving its "Robotaxi" fleet in Austin, Texas, bringing the total to 14 incidents since the service launched in June 2025. The newly filed NHTSA data also reveals that Tesla quietly upgraded one earlier crash to include a hospitalization injury, something the company never disclosed publicly.

The new data comes from the latest update to NHTSA's Standing General Order (SGO) incident report database for automated driving systems (ADS). We have been tracking Tesla's Robotaxi crash data closely, and the trend is not improving.

Tesla submitted five new crash reports in January 2026, covering incidents from December 2025 and January 2026. All five involved Model Y vehicles operating with the autonomous driving system "verified engaged" in Austin.

The new crashes include a collision with a fixed object at 17 mph while the vehicle was driving straight, a crash with a bus while the Tesla was stationary, a collision with a heavy truck at 4 mph, and two separate incidents where the Tesla backed into objects, one into a pole or tree at 1 mph and another into a fixed object at 2 mph.

As with every previous Tesla crash in the database, all five new incident narratives are fully redacted as "confidential business information." Tesla remains the only ADS operator to systematically hide crash details from the public through NHTSA's confidentiality provisions. Waymo, Zoox, and every other company in the database provide full narrative descriptions of their incidents.

Buried in the updated data is a revised report for a July 2025 crash (Report ID 13781-11375) that Tesla originally filed as "property damage only." In December 2025, Tesla submitted a third version of that report upgrading the injury severity to "Minor W/ Hospitalization."

This means someone involved in a Tesla "Robotaxi" crash required hospital treatment. The original crash involved a right turn collision with an SUV at 2 mph. Tesla's delayed admission of hospitalization, five months after the incident, raises more questions about its crash reporting, which is already heavily redacted.

With 14 crashes now on the books, Tesla's "Robotaxi" crash rate in Austin continues to deteriorate. Extrapolating from Tesla's Q4 2025 earnings mileage data, which showed roughly 700,000 cumulative paid miles through November, the fleet likely reached around 800,000 miles by mid-January 2026. That works out to one crash every 57,000 miles.

The irony is that Tesla's own numbers condemn it. Tesla's Vehicle Safety Report claims the average American driver experiences a minor collision every 229,000 miles and a major collision every 699,000 miles. By Tesla's own benchmark, its "Robotaxi" fleet is crashing nearly 4 times more often than what the company says is normal for a regular human driver in a minor collision. And virtually every one of these miles was driven with a trained safety monitor in the vehicle who could intervene at any moment, meaning the monitors likely prevented additional crashes that Tesla's system would not have avoided on its own.

Using NHTSA's broader police-reported crash average of roughly one per 500,000 miles, the picture is even worse: Tesla's fleet is crashing at approximately 8 times the human rate.
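For readers who want to check the arithmetic, the rate comparison above can be reproduced directly from the figures quoted in this story (the 800,000-mile figure is the article's mid-January estimate, not an official Tesla number):

```python
# Crash-rate comparison using the figures quoted in the story.
robotaxi_crashes = 14
robotaxi_miles = 800_000          # estimated cumulative paid miles through mid-January 2026

miles_per_crash = robotaxi_miles / robotaxi_crashes
print(f"Robotaxi: one crash every {miles_per_crash:,.0f} miles")  # roughly 57,000

# Benchmarks: Tesla's Vehicle Safety Report and NHTSA's police-reported average
tesla_minor_benchmark = 229_000   # miles per minor collision, average US driver
nhtsa_police_reported = 500_000   # miles per police-reported crash (rough average)

print(f"vs. Tesla's own benchmark:  {tesla_minor_benchmark / miles_per_crash:.1f}x the human rate")
print(f"vs. NHTSA police-reported:  {nhtsa_police_reported / miles_per_crash:.1f}x the human rate")
```

The first ratio lands at about 4, the second near 9, matching the "nearly 4 times" and "approximately 8 times" figures cited above.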

Meanwhile, Waymo has logged over 127 million fully driverless miles (no safety driver, no monitor, no chase car), and independent research shows Waymo reduces injury-causing crashes by 80% and serious-injury crashes by 91% compared to human drivers. Waymo reports 51 incidents in Austin alone in this same NHTSA database, but its fleet has driven orders of magnitude more miles in the city than Tesla's supervised "robotaxis."

[...] We keep updating this story because the data keeps getting worse. Five more crashes, a quietly upgraded hospitalization, and total narrative redaction across the board, all from a company that claims its autonomous driving system is safer than humans.

Tesla fans and shareholders hold on to the thought that the company's robotaxis are not responsible for some of these crashes, which is true, even though that is much harder to determine with Tesla redacting the narrative on every crash. But the problem remains that even Tesla's own benchmark shows humans crash less often.

The 14 crashes over roughly 800,000 miles work out to one crash every 57,000 miles. Tesla's own safety data indicate that a typical human driver has a minor collision every 229,000 miles, whether or not they are at fault.

By the company's own numbers, its "Robotaxi" fleet crashes nearly 4 times more often than a normal driver, and every single one of those miles had a safety monitor who could hit the kill switch. That is not a rounding error or an early-program hiccup. It is a fundamental performance gap.

What makes this especially frustrating is the lack of transparency. Every other ADS company in the NHTSA database (Waymo, Zoox, Aurora, Nuro) provides detailed narratives explaining what happened in each crash. Tesla redacts everything. We cannot independently assess whether Tesla's system was at fault, whether the safety monitor failed to intervene in time, or whether these were unavoidable situations caused by other road users. Tesla wants us to trust its safety record while making it impossible to verify.

The craziest part is that Tesla began offering rides without a safety monitor in Austin in late January 2026, just after it experienced 4 crashes in the first half of the month.

As we reported in our status check on the program yesterday, the service currently has roughly 42 active cars in Austin with below 20% availability, and the rides without a safety monitor are extremely limited and not running most of the time. Still, it is worrisome that Tesla would even attempt this knowing that its crash rate, with a safety monitor in the front passenger seat, is already higher than that of human drivers.


Original Submission

This discussion was created by janrinok (52) for logged-in users only. Log in and try again!
  • (Score: 2, Informative) by Anonymous Coward on Thursday February 26, @09:38PM

    The lack of transparency should be no surprise. The man pays tribute to the king. The only frustrating part is knowing that nothing is going to change.

  • (Score: 5, Interesting) by vux984 on Thursday February 26, @11:34PM (3 children)

    By the company's own numbers, its "Robotaxi" fleet crashes nearly 4 times more often than a normal driver, and every single one of those miles had a safety monitor who could hit the kill switch. That is not a rounding error or an early-program hiccup. It is a fundamental performance gap.

    I'm no fan of Tesla, and even less so of automated driving fleets on our streets in the state they are in ... but this reporting seems extremely biased.
Of the 5 new incidents we have 3 under 2 mph hitting fixed objects. To then claim that it crashes 4 times more often than a normal driver is pretty silly - I doubt the average normal driver reports sub-2 mph incidents with fixed objects to anybody, ever.

    This means someone involved in a Tesla "Robotaxi" crash required hospital treatment. The original crash involved a right turn collision with an SUV at 2 mph. Tesla's delayed admission of hospitalization, five months after the incident, raises more questions about its crash reporting, which is already heavily redacted.

A 2 mph collision that requires hospitalization? That's granny-with-a-cane walking speed. Between a Tesla Model Y and an "SUV"? I agree with the article's criticism that Tesla's redacted crash reporting is not in the public interest and is extremely worthy of criticism, but at the same time, a hospitalization after a 2 mph collision in a modern vehicle? That needs context. Either someone went to the hospital for a stubbed toe, or there was a pedestrian caught between them.

    • (Score: 2, Touché) by Anonymous Coward on Friday February 27, @05:33AM

      OR the mph figures only apply to the Tesla. How fast was the SUV going?

    • (Score: 4, Insightful) by Spamalope on Friday February 27, @02:22PM (1 child)

      Yep.
      Since all things Elon became political tribalism you can't blindly accept reporting in either direction.

You need a comparison that includes severity, a way to filter out 'accidents' that wouldn't be reported for a human driver, and payout-motivated reports/incidents (i.e., scammers causing a crash to get paid)...

      Also, you've got to compare with taxi accident rates. Accident rates are influenced by activity.

      • (Score: 2) by aafcac on Friday February 27, @04:29PM

It's not just that; you had to do that even before we knew about the Nazi stuff. The man is a massive fraud and liar. We would have had NYC-to-LA autonomously driven vehicles by now if he had been able to deliver on his lies.

  • (Score: 1, Troll) by Anonymous Coward on Friday February 27, @12:10AM (4 children)

    Waymo had its fair share of crashes and fuckups. Face it, you hate Elon because he's right wing. Few ragged on him this much before the trump pivot. When he was a good little leftist, he was electric car jesus. Now there's hit pieces and people screeching in regards to any company he is involved with. Even true believers setting teslas on fire.

    • (Score: 3, Interesting) by Thexalon on Friday February 27, @02:33AM (3 children)

      I think Waymo also deserves to be ridiculed. It turns out that by a lot of metrics self-driving vehicles currently aren't as good as human drivers. Until they are statistically better than human drivers, they probably shouldn't be on our streets.

      As for Elon, given the degree to which he has made himself a public figure, including volunteering to have a political role at DOGE and having his companies make substantial portions of their revenue from the government, of course public opinion is going to exist around him. And like most public figures, there's a lot to criticize.

      --
      "Think of how stupid the average person is. Then realize half of 'em are stupider than that." - George Carlin
      • (Score: 3, Interesting) by aafcac on Friday February 27, @04:28AM (2 children)

        I've said it before and I'll say it again, I think a massive portion of the training should be on actual driving by real people and then just work out which maneuvers were safe and which aren't. You can train the models to do more of the safe stuff and less of the dangerous stuff.

It's definitely not an easy thing to solve, but it would probably get to the same basic place you'd get with actual robot cars training on the streets.

        • (Score: 2, Informative) by Anonymous Coward on Friday February 27, @04:57AM (1 child)

          > actual driving by real people

          Yes, I recall you writing this before. Tesla was recording how their customers drove when using less complex driver aids, like automatic cruise control. As soon as Tesla had cameras on/in their cars, they were sending data (video, steering, speed, accelerometers, etc) home to Tesla. This started well before they offered their mis-named "Autopilot" option. In at least one of his tech-day videos, Elon bragged that his self-driving project had a big advantage over his competitors because of all the video and other data he was collecting.

          The problem is still that you need an enormous sample, the first million miles is barely getting warmed up. The frequency of dangerous situations and actual crashes is low compared to the 3 Trillion (yes, 3000 Billion) miles driven yearly in the USA. I'll say this again--do you know anyone that was an actual eyewitness to a fatal accident? Compared to the whole system they are very rare.

          • (Score: 2) by aafcac on Friday February 27, @04:34PM

            You need an enormous amount of data regardless of what way you go about training. Pretty much the only way you can responsibly get it is the way I said. I'm pretty sure that Ford, Toyota, Subaru and most of the others that aren't negatively in the news for issues are doing something similar. Americans drive trillions of miles a year in total, so there's plenty of driving to get massive amounts of data as more of these cars start driving those miles.

  • (Score: 1, Interesting) by Anonymous Coward on Friday February 27, @04:07AM (1 child)

    There are lots of drivers who have driven for years or even decades without having any crashes. And when they do have such a crash it could be because of the crappy driver hitting them.

    I'm not that safe a driver and I have not had a crash for more than 5-10 years. And that last one was the other driver at fault for not looking and suddenly side swiping me.

    If I'm going to die in a self-driven car, I don't want my last thoughts to include "Even I could have avoided that accident".

    "Comparing Real-World Behaviors of Drivers With High versus Low Rates of Crashes and Near-Crashes"
    https://www.nhtsa.gov/sites/nhtsa.gov/files/811091.pdf [nhtsa.gov]
    See: Figure 9. The Frequency of At-fault Crashes and Near-crashes per MVMT by Driver

    Also:

    The 100-Car Naturalistic Driving Study (Dingus et al., 2006) database provides a unique opportunity to compare those drivers who were excessively involved in crashes/near-crashes with those drivers who were not involved in any type of traffic conflict. The drivers in the 100-Car Study demonstrated high variability in driving performance and crash involvement. It should be noted that a crash in the 100-Car Study was operationally defined as any physical contact with a vehicle, object, or pedestrian, which also included high-g tire strikes (hitting a curb while traveling over 35 mph).

    The results indicated that 7 percent of the drivers were not involved in any crashes, near-crashes, or incidents, while the worst 7 percent of drivers were involved in at least three crashes or minor collisions within a 12-month data collection period.

    "An Overview of the 100-Car Naturalistic Study and Findings"
    https://www.nhtsa.gov/sites/nhtsa.gov/files/100car_esv05summary.pdf [nhtsa.gov]

    Since it was possible to detect all crashes regardless of severity, it is interesting to note the large number of drivers who experienced one or more collisions during the 12 to 13 month data collection period. Of all drivers, 7.5% of drivers never experienced an event of any severity. In contrast, 7.4% of the drivers experienced many incidents and 3 or 4 crashes. Thus, a handful of subjects were either very risky drivers or very safe, with the majority of drivers demonstrating a relatively normal distribution of events across the data collection period.

    As for Waymo, there was a recent incident where it hit a kid dashing out from behind an SUV ( https://www.roadandtrack.com/news/a70189392/nhtsa-investigation-waymo-child-hit-by-autonomous-vehicle-near-elementary-school/ [roadandtrack.com] ) .

    Do Waymo cars use sensors near bumper/headlight height to look for legs, animals, moving objects under vehicles? I can't find the Waymo car POV video of that incident.

    Because a safe human driver also looks for stuff UNDER VEHICLES. You see some kids legs under an SUV, indicating the kid is running into your path, you stop. If the self-driving car uses sensors at bumper/headlight height it can see further under tall vehicles than a human driver can. Most tall vehicles that would hide obstacles from human driver eye heights have higher ground clearance that will allow bumper level sensors to see further under them.

    • (Score: 1, Interesting) by Anonymous Coward on Friday February 27, @05:11AM

      > I'm not that safe a driver

      Assume you aren't a teen with raging hormones and little driving experience, or a senior with badly failing senses. If we start from that base, and if you don't drive drunk/high, then there's a very good chance you are quite a bit safer than the average. If you also don't talk on the cell/mobile while driving, add another good bit of "safer" for not being distracted. I suspect just avoiding these three behaviors puts you in the top 10% of the driver population.

      In very round numbers, I think self-driving has to be about 10X safer than the average driver...to match the statistical safety record of the drivers described above.

      Personally, I'm not even tempted to talk on a cell phone because I don't own one. On trips, I usually have my better half with me and she has the cell phone.

  • (Score: 2) by jelizondo on Friday February 27, @04:48AM

    Does it get better or do you simply sue [engadget.com] those who dare tell the truth?

  • (Score: 2) by jb on Friday February 27, @06:03AM

    Buried in the updated data is a revised report for a July 2025 crash (Report ID 13781-11375) that Tesla originally filed as "property damage only." In December 2025, Tesla submitted a third version of that report upgrading the injury severity to "Minor W/ Hospitalization."

    The two reports can be reconciled with each other only if the person injured was a slave. That seems somewhat less likely than one or other of the reports simply being an outright lie, but a possibility nevertheless.
