posted by janrinok on Monday June 20 2022, @09:55AM
from the say-goodnight-elon dept.

While it may not be all that surprising to SN readers, some data on "self driving" cars has now hit the big time, WaPo reports: https://www.washingtonpost.com/technology/2022/06/15/tesla-autopilot-crashes/

Tesla vehicles running its Autopilot software have been involved in 273 reported crashes over roughly the past year, according to regulators, far more than previously known and providing concrete evidence regarding the real-world performance of its futuristic features.

The numbers, which were published by the National Highway Traffic Safety Administration for the first time Wednesday, show that Tesla vehicles made up nearly 70 percent of the 392 crashes involving advanced driver-assistance systems reported since last July, and a majority of the fatalities and serious injuries — some of which date back further than a year. Eight of the Tesla crashes took place before June 2021, according to data released by NHTSA on Wednesday morning.

And five of the six fatalities were linked to Tesla cars; the other involved one of the competing Level 2 systems offered by other automakers.

WaPo continues,

The new data set stems from a federal order last summer requiring automakers to report crashes involving driver assistance to assess whether the technology presented safety risks. Tesla's vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

The NHTSA order required manufacturers to disclose crashes where the software was in use within 30 seconds of the crash, in part to mitigate the concern that manufacturers would hide crashes by claiming the software wasn't in use at the time of the impact. [Ed: Emphasis provided by the submitter.]


Original Submission

 
  • (Score: 5, Insightful) by Runaway1956 on Monday June 20 2022, @11:44AM (40 children)

    by Runaway1956 (2926) Subscriber Badge on Monday June 20 2022, @11:44AM (#1254578) Journal

    I don't think you're looking at the right statistics. How about accidents per miles traveled? If human drivers have x accidents per million miles traveled, how do self driving cars compare? Any number greater than x is unsatisfactory. Any numbers less than or equal to x are satisfactory. Numbers that are mere fractions of x would be considered excellent.
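
    Just to make that arithmetic concrete, here is a minimal sketch of the comparison being proposed, with made-up placeholder numbers (the 273 figure is from the article; both mileage denominators are pure assumptions, since nobody publishes them):

        # Hypothetical per-mile comparison; every number except the 273 reported
        # Autopilot crashes is an illustrative placeholder, not real data.
        human_crashes = 5_000_000          # assumed crashes by human drivers in a year
        human_miles = 3_000_000_000_000    # assumed vehicle-miles driven by humans in that year

        autopilot_crashes = 273            # reported Autopilot crashes (from the article)
        autopilot_miles = 400_000_000      # assumed miles driven with Autopilot engaged (unknown)

        x = human_crashes / human_miles * 1_000_000          # human crashes per million miles
        y = autopilot_crashes / autopilot_miles * 1_000_000  # Autopilot crashes per million miles

        print(f"humans:    {x:.2f} crashes per million miles")
        print(f"autopilot: {y:.2f} crashes per million miles")
        print("autopilot looks", "better" if y < x else "worse", "on these made-up numbers")

    The whole argument hinges on the denominators, which is exactly the data nobody publishes.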

    That said, I shall never understand why Elon insists that his cars be restricted to a single sensory channel, that is, the visible light spectrum. I'm repeating myself here, but I say give the computer all the sensory input possible. Radar is good, lidar is good, laser is good, sound is good, echolocation might be good, visible light is good, infrared is good, ultraviolet might prove beneficial in some cases. Give the computer three or more channels, don't restrict it to a single channel.

    Elon is just being stubbornly stupid here.

  • (Score: 2) by PiMuNu on Monday June 20 2022, @11:55AM

    by PiMuNu (3823) on Monday June 20 2022, @11:55AM (#1254579)

    I agree - that's why I finished with "there is no evidence for increased danger relative to regular car" or somesuch.

    There are many confounding/systematic biases in this sort of analysis. Another example in addition to yours - one should also look at type of roads travelled; highways are safer than local roads for instance.

  • (Score: 1, Interesting) by Anonymous Coward on Monday June 20 2022, @12:05PM (7 children)

    by Anonymous Coward on Monday June 20 2022, @12:05PM (#1254584)

    > Numbers that are mere fractions of x would be considered excellent.

    Not by me. I don't drive impaired, drive stick (engaged with driving), no smart phone in the car, live in a state with an annual vehicle inspection (even older cars are mostly in good working order), and have a number of other things that are, statistically, in my favor. While nothing is a guarantee, I believe I'm roughly 10x less likely to have a serious accident than your "x".

    Thus the target before I consider self driving is x/10.

    • (Score: 5, Insightful) by maxwell demon on Monday June 20 2022, @12:56PM (3 children)

      by maxwell demon (1608) on Monday June 20 2022, @12:56PM (#1254593) Journal

      You are missing one crucial point: You can have an accident which you didn't cause. So the question is not only whether you would be safer when you are using a self-driving car versus driving yourself, but also whether you are safer if the other cars are self-driving versus driven by humans.

      I don't think privately owned cars will be the primary market for self-driving cars anyway. Rather, self-driving cars will likely mainly replace cars with paid drivers.

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 1, Interesting) by Anonymous Coward on Monday June 20 2022, @01:19PM (2 children)

        by Anonymous Coward on Monday June 20 2022, @01:19PM (#1254597)

        > You can have an accident which you didn't cause.

        Yes, of course. That's why I normally don't:
        + follow closely behind any car, and Teslas in particular since they are prone to slamming on their brakes for false alarms (automatic emergency braking, AEB)
        + deal with rush hour traffic (have been fortunate to work from home most of my life)

        And generally do:
        + "defensive driving" things--watching for erratic behavior (other vehicles, peds, cyclists, deer, etc) and a variety of other things to avoid.
        + attend advanced driver trainings (multiple times) and even a three day race car school to perfect car control skills.

        These things are also part of my approx x/10 demographic. But none of this is a guarantee; getting complacent is not an option.

        • (Score: 0) by Anonymous Coward on Monday June 20 2022, @04:57PM (1 child)

          by Anonymous Coward on Monday June 20 2022, @04:57PM (#1254663)

          Would you accept x/9.5?

          • (Score: 1, Interesting) by Anonymous Coward on Monday June 20 2022, @05:15PM

            by Anonymous Coward on Monday June 20 2022, @05:15PM (#1254671)

            Ha Ha, sure! On a bad day my demographic might only be x/8...

            Hint -- all these numbers are approximate and meant to point out that "as safe as the average driver" is not an acceptable target for self driving. Anyone who survives the high-testosterone years and doesn't drive impaired (drugs, texting, etc.) is much better than average.

    • (Score: 3, Insightful) by DeathMonkey on Monday June 20 2022, @08:35PM (1 child)

      by DeathMonkey (1380) on Monday June 20 2022, @08:35PM (#1254735) Journal

      Everybody thinks they're an above average driver.

      Averages say at least half of them are wrong!

      • (Score: 0) by Anonymous Coward on Tuesday June 21 2022, @08:31AM

        by Anonymous Coward on Tuesday June 21 2022, @08:31AM (#1254844)

        Actually, that's wrong. Your averages assume that all people are in agreement on what makes an above-average driver. More than 50% of people can be accurate in thinking they're an above-average driver according to their own metrics (it can also be less than 50%).

    • (Score: 2, Redundant) by FatPhil on Tuesday June 21 2022, @08:35AM

      by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Tuesday June 21 2022, @08:35AM (#1254845) Homepage
      x/10 is a mere fraction of x.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 5, Interesting) by TheGratefulNet on Monday June 20 2022, @01:26PM

    by TheGratefulNet (659) on Monday June 20 2022, @01:26PM (#1254599)

    I mostly disagree with runaway on all things, but this time it's a blue moon event - we agree ;)

    you need sensor richness, and fusion is 'hard', but elon has given up, and since he hires mostly junior engineers, it's just beyond what they're capable of.

    little known secret, but tesla is one of the worst payers in the market. their starting salaries are tens of thousands too low for the bay area. and no one senior that I work with would ever go back to tesla (a lot of people that work in my co. used to be at tesla), and no one who hears stories of that place wants to interview there.

    it's because of cost. elon is a cheap bastard. isn't that how it always is? he won't keep parts in stock, so you have to wait weeks or months to get your car fixed, and he has no one on phones, so there's no way to contact people in service other than TEXTING (wut?). you pay $150k or more for a high end car and you still can't call your service center about issues? what a way to treat customers...

    no loaners. if you are lucky you get uber credits when your car breaks down. admittedly they don't break that often, but accidents happen, and since repair parts are in short supply and no one makes non-oem parts (yet) for these cars - you better hope you don't need repairs! I avoid driving my car for fun and only drive it when I need to, since each drive is a risk I take. someone hits me and I need a new pane of glass or sheet metal, and I could be in a rental for months. not a fun car ownership experience.

    and let's not even talk about the regressions in their software stack that have many of us refusing updates. updates remove features. this is tesla, guys..

    --
    "It is now safe to switch off your computer."
  • (Score: 3, Informative) by khallow on Monday June 20 2022, @02:56PM (27 children)

    by khallow (3766) Subscriber Badge on Monday June 20 2022, @02:56PM (#1254619) Journal

    That said, I shall never understand why Elon insists that his cars be restricted to a single sensory channel, that is, the visible light spectrum. I'm repeating myself here, but I say give the computer all the sensory input possible. Radar is good, lidar is good, laser is good, sound is good, echolocation might be good, visible light is good, infrared is good, ultraviolet might prove beneficial in some cases. Give the computer three or more channels, don't restrict it to a single channel.

    I disagree because you vastly increase the maintenance problem. Each additional sensor group is another bunch of failure modes happening. This is the paradox of redundancy. The more redundancy you add, the more likely you are to get failures that impair or disable the system.

    Visual light sensors are already street legal after all.
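
    To put that "paradox of redundancy" in rough numbers, here is a back-of-the-envelope sketch; the per-trip fault probability is an assumption I made up, not a measured figure:

        # Chance that at least one sensor group is degraded, as a function of how
        # many independent groups you bolt on. p is an assumed per-trip fault probability.
        p = 0.01

        for n in (1, 2, 3, 5):
            at_least_one_fault = 1 - (1 - p) ** n
            print(f"{n} sensor group(s): {at_least_one_fault:.1%} chance something is degraded")

        # Whether a degraded channel actually makes the system less safe depends on
        # how the fusion logic handles disagreement and dropout -- which is the real
        # dispute in this thread, not the arithmetic.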

    • (Score: 2) by Runaway1956 on Monday June 20 2022, @04:03PM (16 children)

      by Runaway1956 (2926) Subscriber Badge on Monday June 20 2022, @04:03PM (#1254649) Journal

      So, which is better? Adding $1000 per year maintenance, or being involved in an accident? I would rather replace two or three sensors annually, than to run over some stupid kid who ran out in front of me to fetch a ball, and the car didn't see him because his shirt was the same color as the parked car he ran from behind. Or, whatever other excuse the software engineers came up with.

      Self driving cars have one major selling point: they are supposed to be safer than human drivers. If the computer is not safer than humans, you have no real selling point.

      Put the array of sensors on the vehicle, wait for the maintenance problems to occur, then go about fixing those problems. Given time, someone will create a better sensor, someone will find a way to better integrate all those sensors, someone else will come up with a better driving program, and then yet again, someone will improve the sensitivity of the sensors to produce fewer false positives, etc. Early adopters pay the cost, of course. That's the way it always has been, no need to change now.

      • (Score: 1, Funny) by khallow on Monday June 20 2022, @04:38PM (5 children)

        by khallow (3766) Subscriber Badge on Monday June 20 2022, @04:38PM (#1254658) Journal

        Adding $1000 per year maintenance, or being involved in an accident?

        That's a lot of maintenance. I'm going with the accident.

        I would rather replace two or three sensors annually, than to run over some stupid kid who ran out in front of me to fetch a ball, and the car didn't see him because his shirt was the same color as the parked car he ran from behind.

        That's not much of a scenario.

        • (Score: -1, Troll) by Anonymous Coward on Monday June 20 2022, @05:13PM (1 child)

          by Anonymous Coward on Monday June 20 2022, @05:13PM (#1254669)

          Externalized costs, not my problem. The kid was probably a dumbass anyway, it's not like we lost a cure for cancer here.

          • (Score: 1) by khallow on Monday June 20 2022, @07:59PM

            by khallow (3766) Subscriber Badge on Monday June 20 2022, @07:59PM (#1254720) Journal

            Externalized costs, not my problem.

            So is that $1k in additional maintenance costs.

        • (Score: 2) by Runaway1956 on Monday June 20 2022, @06:26PM (2 children)

          by Runaway1956 (2926) Subscriber Badge on Monday June 20 2022, @06:26PM (#1254694) Journal

          That's a lot of maintenance. I'm going with the accident.

          Why did I already know your answer to that question?

          That's not much of a scenario.

          We can draw and paint real life scenarios all day long. Snowy day, heavy snowfall, gusting winds blowing the snow around at random. An infrared sensor is almost certainly going to "see" a warm, glowy human body in all that cold. But, your visible light sensors don't see the guy who just stepped (or even slipped) into the street ahead of you, because the snow is swirling in an opaque wall. But, you don't care about him, as much as you care about replacing a faulty sensor or two.

          That is precisely why I find fault with Elon, as well as yourself. Give the car as many senses as reasonably possible. Without those added senses, the cars can't become any safer than a good human driver.

          • (Score: 0, Troll) by khallow on Monday June 20 2022, @07:56PM (1 child)

            by khallow (3766) Subscriber Badge on Monday June 20 2022, @07:56PM (#1254717) Journal

            But, your visible light sensors don't see the guy who just stepped (or even slipped) into the street ahead of you, because the snow is swirling in an opaque wall. But, you don't care about him, as much as you care about replacing a faulty sensor or two.

            You can kill more people with defective transportation systems than you can with sensor edge cases.

            • (Score: 0) by Anonymous Coward on Monday June 20 2022, @07:58PM

              by Anonymous Coward on Monday June 20 2022, @07:58PM (#1254718)

              Indeed, if the visibility is that bad, you shouldn't be driving. Sensors don't change that; they just change what counts as visibility so bad that you're going to run into something if you drive.

      • (Score: 0) by Anonymous Coward on Monday June 20 2022, @05:13PM (2 children)

        by Anonymous Coward on Monday June 20 2022, @05:13PM (#1254668)

        Yep, and I'm sure before too long they'll have the cost of any sensors that need replacing down to something reasonable, or that it would be covered by not needing to spend so much on insurance. Yes, the example $1k a year is a lot compared with the cost of insurance, but insurance covers potentially hundreds of thousands of dollars in legal expenses when something does happen.

        • (Score: 1) by khallow on Monday June 20 2022, @06:59PM (1 child)

          by khallow (3766) Subscriber Badge on Monday June 20 2022, @06:59PM (#1254702) Journal

          Yes, the example $1k a year is a lot compared with the cost of insurance, but insurance covers potentially hundreds of thousands of dollars in legal expenses when something does happen.

          Does that $1k per year reduce liability or increase it? Consider this all-too-common scenario. The manufacturer's vehicle is involved in a collision. It is found that prior to the crash the lidar system (one of the three sensor channels) had a couple of defective sensors. The owner of the vehicle was ignoring the sensor warning light and the slower speed the vehicle had imposed on itself. They (and the drivers of the other vehicle) argue that they aren't at fault. Everyone sues the manufacturer, arguing that it allowed a malfunctioning vehicle to drive itself.

          Consider a second scenario. There's a couple of sensors out in the visual light sensors, the only sensors on the vehicle. Auto-driving is outright disabled with suitable warnings. Since the owner was driving the vehicle, they are deemed at fault. The manufacturer isn't sued.

          • (Score: 0) by Anonymous Coward on Monday June 20 2022, @08:45PM

            by Anonymous Coward on Monday June 20 2022, @08:45PM (#1254744)

            > There's a couple of sensors out in the visual light sensors, the only sensors on the vehicle. Auto-driving is outright disabled with suitable warnings. Since the owner was driving the vehicle, they are deemed at fault. The manufacturer isn't sued.

            Except: I've heard of cases where manufacturer A was sued for not providing something that manufacturers B & C did supply. For example, if B & C supplied redundant cameras, or camera-cleaning systems, A could be sued for not supplying said feature(s). The problem is that the manufacturer usually has the deepest pockets, so they are always sued by product liability lawyers (aka ambulance chasers), along with other parties that are closer to the accident.

      • (Score: 0) by Anonymous Coward on Monday June 20 2022, @05:24PM (6 children)

        by Anonymous Coward on Monday June 20 2022, @05:24PM (#1254675)

        Are all sensors perfectly capable of telling the computer they're broken/failing? The more sensors there are, the more combinations of uncommon/hard-to-predict-in-testing failure cases it runs into. Sure, the car will be safer when everything's working, but it will be expensive to figure out how it fails when sensor X is in degraded state Y and sensor Z is in state W ... and it doesn't sound like Tesla will be spending that money on closed laboratory testing paid out of their pockets alone.

        • (Score: 2) by Runaway1956 on Monday June 20 2022, @06:32PM (5 children)

          by Runaway1956 (2926) Subscriber Badge on Monday June 20 2022, @06:32PM (#1254696) Journal

          It would be nice if failing sensors informed the computer of that fact.

          But, we don't even need to experience failures for the system to fail. All we need are common occurrences that cause people to fail. Glare off of windows or ice, for instance. Fog, blowing snow, torrential rainfall, even blowing leaves might obstruct your view, and likewise, a visible light sensor's view. Or a dazzling sunrise/sunset. In each case, the sensors may be working perfectly within specs, but they fail to report an obstruction because the environment has gone out of spec.

          • (Score: 1) by khallow on Monday June 20 2022, @07:01PM (3 children)

            by khallow (3766) Subscriber Badge on Monday June 20 2022, @07:01PM (#1254703) Journal

            In each case, the sensors may be working perfectly within specs, but they fail to report an obstruction because the environment has gone out of spec.

            Sorry, in that case, they should detect said obstruction, report that the environment has gone out of spec, and the vehicle should modify its driving behavior appropriately - just like a human driver would in the same situation.

            • (Score: 2) by Runaway1956 on Monday June 20 2022, @11:16PM (2 children)

              by Runaway1956 (2926) Subscriber Badge on Monday June 20 2022, @11:16PM (#1254777) Journal

              Likewise, if a gust of snow laden wind obstructs your view for a critical 2 seconds, you should still detect the obstruction that has moved in front of you, and you should take the appropriate action to avoid the obstruction.

              You're completely failing to make sense today. People frequently have accidents in inclement weather. You can expect sensors to also have accidents more frequently in inclement weather. And, once again, redundant sensors of different types should overcome some of mankind's failings.

              Put visible light, lidar, and one other sensor type on the vehicle, even if it costs khallow an extra thousand dollars.

              • (Score: 1) by khallow on Tuesday June 21 2022, @02:09AM (1 child)

                by khallow (3766) Subscriber Badge on Tuesday June 21 2022, @02:09AM (#1254800) Journal

                Likewise, if a gust of snow laden wind obstructs your view for a critical 2 seconds, you should still detect the obstruction that has moved in front of you, and you should take the appropriate action to avoid the obstruction.

                Indeed. It doesn't take me that long to determine my vision has been blocked. And I would slow down from an already slow speed.

                You're completely failing to make sense today. People frequently have accidents in inclement weather. You can expect sensors to also have accidents more frequently in inclement weather. And, once again, redundant sensors of different types should overcome some of mankind's failings.

                Except when they don't because they fail so often that they are worse than useless.

          • (Score: 2) by ChrisMaple on Monday June 20 2022, @11:05PM

            by ChrisMaple (6964) on Monday June 20 2022, @11:05PM (#1254776)

            Sensors of the sorts used in self-driving cars don't "report", they sense. That means they convert light, or other electromagnetic radiation, or sound, into electrical signals. A processor evaluates those signals to produce what could be called a report, which is then either used in the same processor to initiate action or sent to another processor to do that.
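
            In code terms, that division of labor might look something like the toy sketch below; the names and the threshold are invented for illustration, not taken from any real system:

                from dataclasses import dataclass

                @dataclass
                class RawSignal:
                    channel: str
                    value: float  # e.g. a photodiode voltage or a return-pulse time; no meaning attached yet

                def camera_sensor(incident_light: float) -> RawSignal:
                    # Sensing: convert a physical quantity into an electrical signal. No interpretation here.
                    return RawSignal(channel="camera", value=incident_light * 0.8)

                def perception_processor(signal: RawSignal) -> str:
                    # Evaluation: turn the signal into the "report" that downstream logic acts on.
                    return f"{signal.channel}: obstacle likely" if signal.value > 0.5 else f"{signal.channel}: clear"

                print(perception_processor(camera_sensor(0.9)))  # camera: obstacle likely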

    • (Score: 2) by tangomargarine on Monday June 20 2022, @04:57PM (3 children)

      by tangomargarine (667) on Monday June 20 2022, @04:57PM (#1254664)

      The more different kinds of sensors all talking to the computer, the higher the odds of them disagreeing with each other too, I would think. So when the radar says "there's a semi stopping in front of you" and the LIDAR says "I don't see anything", you have to reconcile the readings.

      Which gets us back to the old issue "do we play it safe and maybe your car comes to a screeching halt when a squirrel runs across the road, or do we play it looser and maybe you rear-end the semi".
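
      A toy version of those two policies, purely for illustration (the function names, confidence values, and threshold are all invented; real systems are far more sophisticated than a one-line rule):

          # Two toy reconciliation rules for disagreeing sensors (hypothetical).

          def brake_play_it_safe(radar_conf: float, lidar_conf: float, camera_conf: float,
                                 threshold: float = 0.6) -> bool:
              # Brake if ANY channel is confident enough -> fewer missed trucks, more phantom stops.
              return max(radar_conf, lidar_conf, camera_conf) >= threshold

          def brake_play_it_loose(radar_conf: float, lidar_conf: float, camera_conf: float,
                                  threshold: float = 0.6) -> bool:
              # Brake only if a MAJORITY of channels agree -> fewer phantom stops, more missed trucks.
              votes = sum(c >= threshold for c in (radar_conf, lidar_conf, camera_conf))
              return votes >= 2

          # Radar sees a stopped semi, lidar sees nothing, the camera is unsure:
          readings = dict(radar_conf=0.9, lidar_conf=0.0, camera_conf=0.4)
          print(brake_play_it_safe(**readings))   # True  -> screeching halt for the squirrel, too
          print(brake_play_it_loose(**readings))  # False -> you may rear-end the semi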

      On the other hand, airliners have to be pretty damn complicated with the number of sensors and instruments, and they (usually) work quite well.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 2) by tangomargarine on Monday June 20 2022, @05:05PM (2 children)

        by tangomargarine (667) on Monday June 20 2022, @05:05PM (#1254665)

        On the other hand, airliners have to be pretty damn complicated with the number of sensors and instruments, and they (usually) work quite well.

        Although now that I think about it, probably 97% of those instruments are concerned with the aircraft itself, since you're not worried about heavy traffic most of the time.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 1) by khallow on Monday June 20 2022, @07:41PM (1 child)

          by khallow (3766) Subscriber Badge on Monday June 20 2022, @07:41PM (#1254715) Journal
          Also keep in mind that airliners receive daily maintenance even when they work right. Any sensor that goes bad gets replaced quickly.
    • (Score: 2) by JoeMerchant on Monday June 20 2022, @07:19PM (5 children)

      by JoeMerchant (3937) on Monday June 20 2022, @07:19PM (#1254709)

      This is a paradox of perception... A system with visual + LIDAR can be as safe as a system with visual only, even when the LIDAR is dysfunctional, and possibly even when the LIDAR is working but there's a bug-splat across all the cameras. But what good lawyer would stand up and say "it's safe enough if you just have one"? No, we must immediately recuse ourselves from responsibility when even one input channel is partly impaired.

      And, yet, we license deaf drivers, drivers who require corrective lenses, drivers with cataracts...

      --
      🌻🌻 [google.com]
      • (Score: 1) by khallow on Monday June 20 2022, @07:54PM (4 children)

        by khallow (3766) Subscriber Badge on Monday June 20 2022, @07:54PM (#1254716) Journal
        This is a huge part of my argument. Risk and liability are poorly handled in today's societies. But even in the absence of that, redundancy doesn't automatically increase reliability or reduce risk.

        It reminds me of a classic quip: a man with two watches doesn't know what time it is.
        • (Score: 2) by JoeMerchant on Monday June 20 2022, @09:13PM (3 children)

          by JoeMerchant (3937) on Monday June 20 2022, @09:13PM (#1254755)

          >redundancy doesn't automatically increase reliability or reduce risk.

          If you handle the information poorly, no.

          >It reminds me of a classic quip: a man with two watches doesn't know what time it is.

          This must make more sense to conservatives. Making decisions is certainly easier when your input information is limited, but wearing blinders doesn't enable better decision making - except to reduce confusion on the part of the decision maker, and that's more of a fault in the decision maker than anything else.

          I might suggest that a little maturity, training, learning how to deal with the additional information, and learning how to tell good information from bad are apparently hard for a lot of people. But constructed decision making systems are not people - they can be taught how to use multiple input channels to check each other, and to make at least as good a decision as a single input system would all the time, and better decisions when the additional information is available and valuable.

          As long as it's not like my robot lawnmower that packs it up and refuses to move when one sensor is detected out of expected range, but cheerfully mows along when a sensor is stuck in mid-range. Yes, constructed systems can be made like befuddled people, but it's not an actual required property of the constructed system.
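
          For what it's worth, the lawnmower's behavior is the predictable result of doing only a range check; catching a sensor stuck at a plausible value needs something like a liveness check. A rough sketch (names and thresholds invented):

              # A bare range check catches out-of-range faults but not a sensor frozen at
              # a plausible value; a liveness/variance check is needed for the second case.

              def range_check(reading: float, lo: float = 0.0, hi: float = 1.0) -> bool:
                  return lo <= reading <= hi  # catches the "out of expected range" fault

              def liveness_check(history: list, min_spread: float = 1e-3) -> bool:
                  # A sensor on a moving mower should jitter at least a little; a perfectly
                  # flat trace suggests it is stuck, even if the value itself looks sane.
                  return (max(history) - min(history)) >= min_spread

              stuck = [0.50] * 20
              print(range_check(stuck[-1]))  # True  -> the simple check is happy
              print(liveness_check(stuck))   # False -> the stuck-at-mid-range fault is caught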

          --
          🌻🌻 [google.com]
          • (Score: 1) by khallow on Monday June 20 2022, @09:31PM (1 child)

            by khallow (3766) Subscriber Badge on Monday June 20 2022, @09:31PM (#1254759) Journal

            It reminds me of a classic quip: a man with two watches doesn't know what time it is.

            This must make more sense to conservatives. Making decisions is certainly easier when your input information is limited, but wearing blinders doesn't enable better decision making - except to reduce confusion on the part of the decision maker, and that's more of a fault in the decision maker than anything else.

            Which watch has the right time? And reducing confusion on the part of the decision maker does sound helpful, even if that were the only reason to do this. Let's keep in mind that was the actual purpose of wearing blinders historically - horses and other draft animals could be distracted by stuff happening in their peripheral vision. Blinders helped with that.

            • (Score: 0) by Anonymous Coward on Wednesday June 22 2022, @08:50PM

              by Anonymous Coward on Wednesday June 22 2022, @08:50PM (#1255452)

              > It reminds me of a classic quip: a man with two watches doesn't know what time it is.

              A man with two watches has a choice.
              ftfy

              Reminds me of the wall clock at a popular college hangout (1970s)--it was set ~10 minutes fast on purpose. Even had a sign to that effect taped next to the clock. Was a big help in getting to classes on time!

          • (Score: 0) by Anonymous Coward on Wednesday June 22 2022, @02:14AM

            by Anonymous Coward on Wednesday June 22 2022, @02:14AM (#1255181)

            I think it is the classic fallacy of mistaking an absence of evidence for evidence of absence. If a person has two watches and they disagree, all that does is provide evidence that one or both of the watches is wrong. With just one watch, you have no such evidence even if it is wrong. The person with one watch doesn't have that additional evidence as to the accuracy of his watch. Consequently, they can't ever know what the time is, thanks to the Gettier Problem. Instead, the single watch makes it so they can act as if they know, or believe in the belief, as Dennett put it.

  • (Score: 2) by DeathMonkey on Monday June 20 2022, @08:42PM

    by DeathMonkey (1380) on Monday June 20 2022, @08:42PM (#1254740) Journal

    Correct, incidents per mile would be the best metric to compare. NHTSA would likely provide that data if they had it, but according to the article they do not.

    FTA

    The data does not lend itself easily to comparisons between different manufacturers, because it does not include information such as how many vehicle miles the different driver-assistance systems were used across or how widely they are deployed across carmakers’ fleets.

    So we need more data to make a proper comparison.

  • (Score: 0) by Anonymous Coward on Tuesday June 21 2022, @08:57AM

    by Anonymous Coward on Tuesday June 21 2022, @08:57AM (#1254846)

    You also need to account for highway-miles driven vs city-miles driven. It's well-known that more accidents happen on complex city streets than on simple, relatively straightforward highways. I'm guessing that autopilot miles are predominantly "easy" miles, rather than the difficult kind.