posted by martyb on Wednesday August 18 2021, @08:40AM

US safety regulator opens investigation into Tesla Autopilot following crashes with parked emergency vehicles – TechCrunch:

U.S. auto regulators have opened a preliminary investigation into Tesla's Autopilot advanced driver assistance system, citing 11 incidents in which vehicles crashed into parked first responder vehicles while the system was engaged.

The Tesla vehicles involved in the collisions were confirmed to have had either Autopilot or a feature called Traffic Aware Cruise Control engaged, according to investigation documents posted on the National Highway Traffic Safety Administration's (NHTSA) website. Most of the incidents took place after dark and occurred despite "scene control measures," such as emergency vehicle lights, road cones and an illuminated arrow board signaling drivers to change lanes.

"The investigation will assess the technologies and methods used to monitor, assist, and enforce the driver's engagement with the dynamic driving task during Autopilot operation," the document says.

The investigation covers around 765,000 Tesla vehicles that span all currently available models: Tesla Model Y, Model X, Model S and Model 3. The 11 incidents resulted in 17 injuries and one fatality. They occurred between January 2018 and July 2021.


Original Submission

Related Stories

Feds Open New Tesla Probe After Two Model Y Steering Wheels Come Off 17 comments

https://arstechnica.com/cars/2023/03/tesla-under-new-federal-investigation-for-steering-wheels-that-detach/

Tesla has yet another federal headache to contend with. On March 4, the National Highway Traffic Safety Administration's Office of Defects Investigation opened a preliminary investigation after two reports of Tesla Model Y steering wheels detaching in drivers' hands while driving.

NHTSA's ODI says that in both cases, the model year 2023 Model Ys each required repairs on the production line that involved removing their steering wheels. The wheels were refitted but were only held in place by friction—Tesla workers never replaced the retaining bolt that affixes the steering wheel to the steering column. In 2018, Ford had to recall more than 1.3 million vehicles after an incorrectly sized bolt resulted in a similar problem.

The ODI document states that "sudden separation occurred when the force exerted on the steering wheel overcame the resistance of the friction fit while the vehicles were in motion" and that both incidents occurred while the electric vehicles still had low mileage.

Related:
Tesla recalls all cars with FSD (full self driving) option (Elon Tweet:"Definitely. The word "recall" for an over-the-air software update is anachronistic and just flat wrong!")
Feds Open Criminal Investigation Into Tesla Autopilot Claims
NHTSA Investigation Into Tesla Autopilot Intensifies
Tesla's Radar-less Cars Investigated by NHTSA After Complaints Spike
Tesla Under Federal Investigation Over Video Games That Drivers Can Play
Tesla Must Tell NHTSA How Autopilot Sees Emergency Vehicles
NHTSA Opens Investigation into Tesla Autopilot after Crashes with Parked Emergency Vehicles
Tesla Recall is Due to Failing Flash Memory
Tesla Crash Likely Caused by Video Game Distraction
Autopilot Was Engaged In The Crash Of A Tesla Model S Into A Firetruck In LA, NTSB Says
Tesla to Update Battery Software after Recent Car Fires
Tesla Facing Criminal Probe
Former Tesla Employee's Lawyer Claims His Client Was Effectively "SWATted"
NHTSA Finishes Investigation, Declares Tesla Has No Fault in Deadly Crash
Tesla Says Autopilot System Not to Blame for Dutch Crash


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by Snotnose on Wednesday August 18 2021, @12:22PM (3 children)

    by Snotnose (1623) on Wednesday August 18 2021, @12:22PM (#1168155)

    When I see a bunch of emergency vehicles I automatically perk up and pay max attention. One would think if your car is in self driving mode, at the sight of emergency vehicles the driver would perk up and take manual control.

    Then again, I have more sense than money.

    --
    Is anyone surprised ChatGPT got replaced by an A.I.?
    • (Score: 0) by Anonymous Coward on Wednesday August 18 2021, @12:31PM

      by Anonymous Coward on Wednesday August 18 2021, @12:31PM (#1168156)

      Follow on: You might also think that the Tesla Autopilot software would notice flashing lights and start an organized shutdown, forcing a handoff back to the person sitting in the driver's seat. But it would appear that didn't happen, at least in the accidents under investigation.

    • (Score: 2) by krishnoid on Wednesday August 18 2021, @06:34PM (1 child)

      by krishnoid (1156) on Wednesday August 18 2021, @06:34PM (#1168241)

      "Autopilot", "self driving", these should just be blacklisted (oops racist) I mean retired (oops ageist/workist) I mean not used. You hear it, you repeat it, you think it, you end up letting the general supervisory universal steering (G-SUS) mechanism take the wheel. How about something meaning "hormonal impulsive nigh-invulnerable underage-drinking teenage student driver system" instead? You'd sure as hell have people *always* ready to take the wheel.

      • (Score: 1, Touché) by Anonymous Coward on Wednesday August 18 2021, @08:51PM

        by Anonymous Coward on Wednesday August 18 2021, @08:51PM (#1168304)

        The word you're looking for is "strawmanned".

  • (Score: 0) by Anonymous Coward on Wednesday August 18 2021, @02:00PM (3 children)

    by Anonymous Coward on Wednesday August 18 2021, @02:00PM (#1168175)

    Wonder if a Tesla is safer than a human. Sometimes it doesn't seem that high of a bar.

    I see a lot of folks looking down at their cell on the highway.

    OTOH, just expecting pattern recognition to work with things it hasn't seen seems an impossible design path.

    • (Score: 0) by Anonymous Coward on Wednesday August 18 2021, @06:49PM (2 children)

      by Anonymous Coward on Wednesday August 18 2021, @06:49PM (#1168246)

      > Wonder if a Tesla is safer than a human.

      Which human? The distracted texter checking their makeup...or the engaged driver using a manual transmission? I'm the latter, not interested in any of this self-driving crap until it is about "as safe as" my demographic.

      ps. I do understand that these things are statistical and my driving style is no guarantee of anything.

      • (Score: 0) by Anonymous Coward on Wednesday August 18 2021, @08:57PM

        by Anonymous Coward on Wednesday August 18 2021, @08:57PM (#1168306)

        Sir, I will step off your fringed poppy meadow.

      • (Score: 0) by Anonymous Coward on Thursday August 19 2021, @04:24AM

        by Anonymous Coward on Thursday August 19 2021, @04:24AM (#1168431)

        I don't wanna ride with that Hartford Insurance rep as shown on TV.

        He's not paying much attention to his driving.

        Seems he looks at his passenger more than he does to the road.

  • (Score: 0) by Anonymous Coward on Wednesday August 18 2021, @05:42PM

    by Anonymous Coward on Wednesday August 18 2021, @05:42PM (#1168228)

    His pursuit of "self driving" is going to make "willful disregard for customer safety" become part of his personal brand, and none of his companies need that.

  • (Score: 4, Insightful) by Runaway1956 on Wednesday August 18 2021, @08:00PM (3 children)

    by Runaway1956 (2926) Subscriber Badge on Wednesday August 18 2021, @08:00PM (#1168270) Journal

    WTF won't Musk put radar and/or lidar and/or laser and/or infrared into his cars?

    Most of us humans understand that under certain conditions, our vision is worse than worthless. Vision can not only fail, but vision and past experience can trick us into doing exactly the wrong thing, at the wrong time. Why can't Musk understand that vision is fallible?

    Give the car more sensors, which translates to more data, which should translate to better accident avoidance.

    That said - it's necessary to understand WHY the cars run into vehicles with flashing lights at night. Are they experiencing almost the same thing humans experience when bright lights are flashed in their eyes at night? Whatever depth perception you had before that first flash hits, is gone. Your night vision is shot to hell. All you can see is the afterglow of the flash, and darkness and shadows all around that glow - until the next flash blinds you again.

    Seriously, Musk needs to pull his head out on this issue. Relying on one small set of sensors that have been proven to be unreliable is just stupid. There's probably little if anything wrong with the navigation computer. The problem is a basic design flaw. Give the computer more data!
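    [Ed. note: the redundant-sensor argument above is the textbook inverse-variance weighting idea, and a minimal sketch makes the payoff concrete. All numbers below are made up for illustration; this is not Tesla's pipeline.]

    ```python
    # Illustrative sketch only: variance-weighted fusion of two independent
    # range estimates (e.g. camera and radar). When one sensor is degraded
    # (glare blowing out the camera), the fused estimate leans on the other,
    # and the fused uncertainty is lower than either sensor's alone.

    def fuse(est_a, var_a, est_b, var_b):
        """Fuse two independent estimates by inverse-variance weighting."""
        w_a = 1.0 / var_a
        w_b = 1.0 / var_b
        fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
        fused_var = 1.0 / (w_a + w_b)
        return fused, fused_var

    # Camera badly degraded by flashing lights (high variance); radar unaffected.
    camera_range, camera_var = 60.0, 400.0   # metres, metres^2
    radar_range, radar_var = 42.0, 4.0

    r, v = fuse(camera_range, camera_var, radar_range, radar_var)
    print(f"fused range {r:.1f} m, variance {v:.2f} m^2")
    ```

    The fused estimate lands almost on the radar reading because the camera's variance is two orders of magnitude worse, which is exactly the "more data" argument: a second modality degrades gracefully where a single one fails outright.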

    --
    “I have become friends with many school shooters” - Tampon Tim Walz
    • (Score: 0) by Anonymous Coward on Wednesday August 18 2021, @10:03PM

      by Anonymous Coward on Wednesday August 18 2021, @10:03PM (#1168338)

      But he's got many thousands of people willing to gather all sorts of real-world data for him. These little issues are just errors that need to be optimized out. Adding more sensors means a lot more testing and development and delays for him. Why do all that when you've got all these people throwing their own money at him and giving him the data for free? Sure you've got to break a few eggs making that omelet (what's a few bodies here and there, these let's call them "heroes" to be lauded in the great Muskie advancement, like the poor creatures who will get to drive the first Tesla on Mars, bask in their 15 minutes of glory, then die horrible deaths on an inhospitable world), but with the right legal indemnification clauses, there's no downside for him.

    • (Score: 1, Funny) by Anonymous Coward on Thursday August 19 2021, @05:27AM

      by Anonymous Coward on Thursday August 19 2021, @05:27AM (#1168441)

      WTF won't Musk put radar and/or lidar and/or laser and/or infrared into his cars?

      Because then he would have to admit that he was wrong.

    • (Score: 3, Insightful) by Joe Desertrat on Thursday August 19 2021, @08:20PM

      by Joe Desertrat (2454) on Thursday August 19 2021, @08:20PM (#1168616)

      Are they experiencing almost the same thing humans experience when bright lights are flashed in their eyes at night?

      Most camera sensors I've had any experience with tend to react badly to suddenly changing light. While a DSLR or security camera taking a second or two to readjust when the lights are turned on in a room is not a big deal, there could be drastic consequences for a vehicle traveling down a road in that same amount of time.
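      [Ed. note: the "second or two to readjust" is easy to put a number on. A rough sketch, not any real camera's firmware: auto-exposure loops typically chase scene brightness with a smoothing factor, so convergence takes several frames, and at highway speed those frames are real distance.]

      ```python
      # Toy model of an auto-exposure loop re-converging after a bright flash.
      # The alpha, tolerance, and speed values are assumptions for illustration.

      def frames_to_converge(start, target, alpha=0.2, tol=0.05):
          """Frames until exposure is within tol (fraction) of target."""
          exposure, frames = start, 0
          while abs(exposure - target) > tol * target:
              exposure += alpha * (target - exposure)  # exponential chase
              frames += 1
          return frames

      fps = 30
      n = frames_to_converge(start=1.0, target=8.0)  # flash drove exposure way down
      seconds = n / fps
      speed_mps = 30.0  # roughly 108 km/h
      print(f"{n} frames ≈ {seconds:.2f} s ≈ {seconds * speed_mps:.1f} m travelled")
      ```

      Even this forgiving toy model gives the car over ten metres of travel on a half-blind sensor, which squares with the comment above: an adjustment delay that is harmless for a security camera is consequential at road speed.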

  • (Score: 2) by Phoenix666 on Thursday August 19 2021, @03:16PM (1 child)

    by Phoenix666 (552) on Thursday August 19 2021, @03:16PM (#1168526) Journal

    I took an extended trip in a Model Y in the Rockies last week, 2/3 on two-lane highways, and 1/3 on the interstate. It had the full self-driving package and we adjusted the finer-grained settings to try and figure out the optimal mix.

    On the 2-lane roads we had to turn off the front collision detection because every time a semi came the other way the Tesla hit the brakes abruptly. On the interstate the lane detection worked great and was even able to exit and enter the freeway without a hitch; however, on the 2-lane road we ran into some less optimal behavior.

    That is, after we got off the freeway we headed up the two-lane road through a series of S-curves in a coulee. On the first blind curve a deer burst out of the ditch and darted right in front of us. It was just after twilight and another car with LED headlights had just passed the other way and ruined my night vision for a moment, so I didn't see it and would have hit it square on, but the Tesla emergency collision detection hit the brakes and saved the day.

    A half mile further on, more deer appeared in front of us, close to the center line. I saw them and tried to steer right onto the shoulder to avoid them, but the Tesla lane detection fought me hard. As a result, we wound up missing those deer by mere inches.

    We were keyed up and totally focused on the road at that point, but the Tesla auto-pilot nag wanted to keep pulling our eyes off the road onto the center console every 30 seconds, breaking our line of sight and ruining our night vision with the bright display. The auto-pilot nag doesn't just require you to keep your hands on the wheel, but to jiggle it slightly. That's not a big deal on an interstate, but on a curvy 2-lane road at night it's far less than optimal. So we turned all of it off and went the rest of the way on full manual.

    So the report is mixed. The Tesla self-driving features both saved us and almost crashed us because it thought it was smarter than we were (or perhaps because the Tesla programmers erred on the side of caution in given situations). You still have to pay attention, and I suspect you have to work with the different settings until you find a muscle-memory friendly mix that works for you. Emergency vehicles with their lights activated are not, however, hard to see or hard to avoid, so if a person in a Tesla crashed into them it says to me they were not paying attention at all.

    It also raises the question: in some states the law says you have to pull over into the far lane and slow down when emergency vehicles are present on the side of the road. Will the Tesla auto-pilot have to implement that everywhere, even where it isn't the law, or will they have to make it location aware so that its behavior around emergency vehicles will adapt?

    --
    Washington DC delenda est.
    • (Score: 2) by fraxinus-tree on Thursday August 19 2021, @07:55PM

      by fraxinus-tree (5590) on Thursday August 19 2021, @07:55PM (#1168611)

      Bright! Blue-ish! Dashboard! Lights! In every car from late 1990s on. Probably they sell, but they are a damnation to every blue-eyed driver (and to lesser extent, to everyone else) at night. Yes, one can generally dim them - but the color temperature doesn't change so you are fscked anyway.
