
posted by hubie on Saturday January 21 2023, @12:26AM
from the you-weren't-supposed-to-take-us-literally dept.

The claim was made in a lawsuit over Walter Huang's fatal Model X crash in 2018:

Tesla's widely viewed 2016 Autopilot demonstration video showing the system stopping for red lights and moving off again when the light changed to green was faked, according to the director of Autopilot software, Ashok Elluswamy. Elluswamy made the statement under oath during a deposition for a lawsuit brought against Tesla following the fatal crash of Apple engineer Walter Huang in 2018.

The video, posted in October 2016 and still available on Tesla's website, begins with the caption: "The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself." We then see a Tesla Model X leave a garage and a driver enter the car as The Rolling Stones' "Paint It Black" begins to play.

[...] At the time, Tesla CEO Elon Musk publicized the video via his Twitter account, telling the world that "Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot." Musk went on to add that "8 cameras, 12 ultrasonars and radar all flush mounted and body color. Beauty remains."

[...] But the Model X in the video was preprogrammed to drive from Menlo Park to Palo Alto, according to Elluswamy, who was a senior software engineer at the time before being promoted to head all Autopilot software development in 2019.

"The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system," Elluswamy said in his testimony, according to Reuters. 3D maps were used to pre-program the route, including where to stop, and during the self-parking demo a Tesla crashed into a fence, Elluswamy said.

The fatal crash occurred on Highway 101 in Mountain View, California, in March 2018, when Huang's Model X, operating under Autopilot, swerved into a highway crash attenuator at more than 70 mph. Tesla blamed Huang for the crash, claiming he was not paying attention. But according to the National Transportation Safety Board (NTSB), Huang had repeatedly complained to friends and family about his car's propensity to swerve toward that particular crash barrier. In 2020, the NTSB had harsh words for Tesla, Caltrans, and the National Highway Traffic Safety Administration, all of which it said shared blame for the death.


Original Submission

 
  • (Score: 5, Insightful) by RamiK (1813) on Saturday January 21 2023, @03:56AM (#1287838) (3 children)

    Elluswamy said drivers could “fool the system,” making a Tesla system believe that they were paying attention based on feedback from the steering wheel when they were not. But he said he saw no safety issue with Autopilot if drivers were paying attention.

    ( https://www.reuters.com/technology/tesla-video-promoting-self-driving-was-staged-engineer-testifies-2023-01-17/ [reuters.com] )

    Self-driving removes the feedback that keeps people focused on the road, so requiring them to keep their hands on the wheel and eyes on the road doesn't change the fact that their reaction times significantly worsen even when they're trying to pay attention.

    And yes. It's been known for years and there's plenty of research to back that up: https://www.sciencedirect.com/science/article/pii/S2352146521000612 [sciencedirect.com] https://www.southampton.ac.uk/news/2017/01/driverless-cars.page [southampton.ac.uk]

    --
    compiling...
  • (Score: 5, Insightful) by sjames (2882) on Saturday January 21 2023, @07:57AM (#1287861) Journal (2 children)

    The idea that the human driver can just take over when the automation gets into a sticky situation reminds me of the old silent film shorts in which the steering wheel comes loose and the driver hands it to the passenger, as if they could do something useful with it.

    • (Score: 2) by aafcac (17646) on Sunday January 22 2023, @01:43AM (#1287986) (1 child)

      I think it should probably be 15 minutes maximum at a time, with a cool-down in between, until it's legitimately good enough to handle typical driving conditions.

      • (Score: 0) by Anonymous Coward on Sunday January 22 2023, @02:49AM (#1288005)

        Nope. Once on, it should not be able to hand control back to the human driver until stopped. But it should also have an automatic penalty to the manufacturer of 100 million dollars per crash, unless they can prove that it was 100% the other driver's fault (e.g., being rear-ended while stopped at a red light).
        Don't care how minor it was: if the autopilot dings someone's bumper while parking, $100,000,000 penalty. That will make them get it right before releasing it.