
posted by janrinok on Wednesday February 26 2020, @09:13PM   Printer-friendly
from the problems-with-his-connection dept.

Tesla Autopilot Crash Driver 'Was Playing Video Game'

BBC:

An Apple employee who died after his Tesla car hit a concrete barrier was playing a video game at the time of the crash, investigators believe.

The US National Transportation Safety Board (NTSB) said the car had been driving semi-autonomously using Tesla's Autopilot software.

Tesla instructs drivers to keep their hands on the wheel in Autopilot mode.
...
But critics say the "Autopilot" branding makes some drivers think the car is driving fully autonomously.

The NTSB said the driver had been "over-reliant" on the software.

Tesla does instruct drivers to keep their hands on the wheel when using Autopilot, and an audible warning sounds if they fail to do so.

Does the Tesla branding of "autopilot" lure drivers into driving dangerously?

Tesla Crash Likely Caused by Video Game Distraction

The NTSB has published a review of a fatal crash involving a Tesla in March 2018 that includes a set of safety recommendations.

In the NTSB press release (2/25/2020) regarding the causes of the crash, the board made the following observations:

The NTSB determined the Tesla "Autopilot" system's limitations, the driver's overreliance on the "Autopilot" and the driver's distraction – likely from a cell phone game application – caused the crash. The Tesla vehicle's ineffective monitoring of driver engagement was determined to have contributed to the crash. Systemic problems with the California Department of Transportation's repair of traffic safety hardware and the California Highway Patrol's failure to "report damage to a crash attenuator led to the Tesla striking a damaged and nonoperational crash attenuator" [sic], which the NTSB said contributed to the severity of the driver's injuries.

"This tragic crash clearly demonstrates the limitations of advanced driver assistance systems available to consumers today," said NTSB Chairman Robert Sumwalt. "There is not a vehicle currently available to US consumers that is self-driving. Period. Every vehicle sold to US consumers still requires the driver to be actively engaged in the driving task, even when advanced driver assistance systems are activated. If you are selling a car with an advanced driver assistance system, you're not selling a self-driving car. If you are driving a car with an advanced driver assistance system, you don't own a self-driving car," said Sumwalt.

"In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures that, when combined, led to this tragic loss. The lessons learned from this investigation are as much about people as they are about the limitations of emerging technologies," said Sumwalt. "Crashes like this one, and thousands more that happen every year due to distraction, are why "Eliminate Distractions" remains on the NTSB's Most Wanted List of Transportation Safety Improvements," he said.

[...] the board also excoriated the National Highway Traffic Safety Administration (NHTSA) for providing utterly ineffectual oversight of so-called "level 2" driver assists, as well as California's highway agency Caltrans, which failed to replace the damaged crash attenuator in front of the concrete gore; a working attenuator would in all likelihood have saved Huang's life.

Previously:
NTSB Releases Preliminary Report on Tesla Autopilot Crash
Tesla Crash: Model X Was In Autopilot Mode, Firm Says


Original Submission #1 | Original Submission #2

Related Stories

Tesla Crash: Model X Was In Autopilot Mode, Firm Says 51 comments

Tesla Model X driver dies in Mountain View crash

Submitted via IRC for Fnord666

The driver of a Tesla Model X has died following a highway crash in Mountain View, leaving a number of safety questions.

Source: https://www.engadget.com/2018/03/24/tesla-model-x-driver-dies-in-mountain-view-crash/

Tesla Crash: Model X Was In Autopilot Mode, Firm Says

In a post on its website, the electric-car maker said computer logs retrieved from the wrecked SUV show that Tesla's driver-assisting Autopilot technology was engaged and that the driver doesn't appear to have grabbed the steering wheel in the seconds before the crash.

The car's 38-year-old driver died after the vehicle hit a concrete lane divider on a Northern California freeway and caught fire. The accident happened March 23.

[...] In its Friday post, Tesla said the crashed Model X's computer logs show that the driver's hands weren't detected on the steering wheel for 6 seconds prior to the accident. It said they also show the driver had "about five seconds and 150 meters of unobstructed view of the concrete divider" before the crash but that "no action was taken."
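As a back-of-the-envelope check (my own arithmetic, not from Tesla or the NTSB), the quoted figures are self-consistent: 150 meters covered in about five seconds is roughly 30 m/s, which matches the reported impact speed of 70.8 mph. A sketch in Python:

```python
# Sanity-check the figures quoted from Tesla's post: 150 m of
# unobstructed view covered in ~5 s implies a speed near the
# reported 70.8 mph impact speed.

METERS_PER_MILE = 1609.344  # exact by definition

def mps_to_mph(mps: float) -> float:
    """Convert meters per second to miles per hour."""
    return mps * 3600 / METERS_PER_MILE

speed_mps = 150 / 5            # distance / time from the logs
speed_mph = mps_to_mph(speed_mps)
print(f"{speed_mph:.1f} mph")  # prints "67.1 mph", consistent with 70.8 mph
```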

The company cited various statistics in defending Autopilot in the post and said there's no doubt the technology makes vehicles safer than traditional cars.

"Over a year ago," the post said, "our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent. Internal data confirms that recent updates to Autopilot have improved system reliability."

"Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur," the post reads. "It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."


Original Submission #1 | Original Submission #2

NTSB Releases Preliminary Report on Tesla Autopilot Crash; Separately, Tesla Lays Off 3,500 44 comments

Submitted via IRC for Runaway1956

Tesla fatal crash: 'autopilot' mode sped up car before driver killed, report finds

A Tesla driving in "autopilot" mode crashed in March when the vehicle sped up and steered into a concrete barrier, according to a new report on the fatal collision, raising fresh concerns about Elon Musk's technology.

The National Transportation Safety Board (NTSB) said that four seconds before the 23 March crash on a highway in Silicon Valley, which killed Walter Huang, 38, the car stopped following the path of a vehicle in front of it. Three seconds before the impact, it sped up from 62mph to 70.8mph, and the car did not brake or steer away, the NTSB said.
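For scale (again my own arithmetic, not the NTSB's), the speed change described works out to a modest but sustained acceleration:

```python
# Rough scale of the acceleration the NTSB describes: 62 mph to
# 70.8 mph over the three seconds before impact.

MPH_TO_MPS = 0.44704  # exact by definition
G = 9.81              # m/s^2

delta_v_mps = (70.8 - 62.0) * MPH_TO_MPS
accel = delta_v_mps / 3.0  # seconds
print(f"{accel:.2f} m/s^2 (~{accel / G:.2f} g)")  # prints "1.31 m/s^2 (~0.13 g)"
```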

[...] The NTSB report [...] has once again raised serious safety questions about the limits and performance of the autopilot technology, which is meant to assist drivers and has faced growing scrutiny from experts and regulators. Mark Fong, an attorney for Huang's family, also said the report appeared to "contradict Tesla's characterization" of the collision.

The NTSB press release includes this link to the preliminary report, for anyone inclined to read the slightly longer version of events.

The Mountain View Fire Department applied about 200 gallons of water and foam to extinguish the post-crash fire. The battery reignited five days after the crash in an impound lot and was extinguished by the San Mateo Fire Department.

Feds Open New Tesla Probe After Two Model Y Steering Wheels Come Off 17 comments

https://arstechnica.com/cars/2023/03/tesla-under-new-federal-investigation-for-steering-wheels-that-detach/

Tesla has yet another federal headache to contend with. On March 4, the National Highway Traffic Safety Administration's Office of Defects Investigation opened a preliminary investigation after two reports of Tesla Model Y steering wheels detaching in drivers' hands while driving.

NHTSA's ODI says that in both cases, the model year 2023 Model Ys each required repairs on the production line that involved removing their steering wheels. The wheels were refitted but were only held in place by friction—Tesla workers never replaced the retaining bolt that affixes the steering wheel to the steering column. In 2018, Ford had to recall more than 1.3 million vehicles after an incorrectly sized bolt resulted in a similar problem.

The ODI document states that "sudden separation occurred when the force exerted on the steering wheel overcame the resistance of the friction fit while the vehicles were in motion" and that both incidents occurred while the electric vehicles still had low mileage.

Related:
Tesla recalls all cars with FSD (full self driving) option (Elon Tweet:"Definitely. The word "recall" for an over-the-air software update is anachronistic and just flat wrong!")
Feds Open Criminal Investigation Into Tesla Autopilot Claims
NHTSA Investigation Into Tesla Autopilot Intensifies
Tesla's Radar-less Cars Investigated by NHTSA After Complaints Spike
Tesla Under Federal Investigation Over Video Games That Drivers Can Play
Tesla Must Tell NHTSA How Autopilot Sees Emergency Vehicles
NHTSA Opens Investigation into Tesla Autopilot after Crashes with Parked Emergency Vehicles
Tesla Recall is Due to Failing Flash Memory
Tesla Crash Likely Caused by Video Game Distraction
Autopilot Was Engaged In The Crash Of A Tesla Model S Into A Firetruck In LA, NTSB Says
Tesla to Update Battery Software after Recent Car Fires
Tesla Facing Criminal Probe
Former Tesla Employee's Lawyer Claims His Client Was Effectively "SWATted"
NHTSA Finishes Investigation, Declares Tesla Has No Fault in Deadly Crash
Tesla Says Autopilot System Not to Blame for Dutch Crash


Original Submission

  • (Score: 5, Funny) by mmh on Wednesday February 26 2020, @09:17PM (10 children)

    by mmh (721) on Wednesday February 26 2020, @09:17PM (#963090)

    Jack Thompson was right, video games do kill people!

  • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:21PM (20 children)

    by Anonymous Coward on Wednesday February 26 2020, @09:21PM (#963094)

    Recognize that the percentage of drivers who will totally ignore the warnings about autopilot, and therefore operate the system unsafely, is large enough to negate any possible safety effects of having the computer in control. Thus such systems should be disabled until it can be proven that they will never get the driver into an accident that a human would easily avoid.

    • (Score: 5, Insightful) by PartTimeZombie on Wednesday February 26 2020, @09:28PM (3 children)

      by PartTimeZombie (4827) on Wednesday February 26 2020, @09:28PM (#963105)

      I see people in non-Tesla cars every day texting while they drive, among other stupid things, so yes.

      It is hard to legislate for stupidity.

      • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @10:29PM (2 children)

        by Anonymous Coward on Wednesday February 26 2020, @10:29PM (#963179)

        True, but one can at least avoid facilitating it, instead of giving people a reason not to pay attention to the road, the way Autopilot does.

        • (Score: 2) by MostCynical on Thursday February 27 2020, @01:58AM (1 child)

          by MostCynical (2589) on Thursday February 27 2020, @01:58AM (#963275) Journal

          Or radios, or makeup, or children, or gps, or phones...
          Humans are stupid.
          We kill and maim millions every year with motor vehicles.
          Self-driving cars are already better drivers than humans, but most humans think they are far better than average drivers.

          so, stupid and delusional.

          --
          "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
          • (Score: 1, Insightful) by Anonymous Coward on Thursday February 27 2020, @03:36AM

            by Anonymous Coward on Thursday February 27 2020, @03:36AM (#963303)

            Citation needed, buddy.
            We don't have enough data to make that determination yet.

    • (Score: 1) by Ethanol-fueled on Wednesday February 26 2020, @09:29PM (6 children)

      by Ethanol-fueled (2792) on Wednesday February 26 2020, @09:29PM (#963107) Homepage

      Eh, in the meantime the Tesla badge on a car is a marker to stay the hell away and follow from a distance in another lane. Though speaking of, I have even worse news for you all: BMW is also working on their own self-driving car. Soon every day on the highways will be Death Race 2000. [wikipedia.org]

      • (Score: 1, Insightful) by Anonymous Coward on Wednesday February 26 2020, @09:35PM

        by Anonymous Coward on Wednesday February 26 2020, @09:35PM (#963115)

        This -- after the Tesla hit the (previously collapsed) barrier, two more cars followed it in and were somewhat damaged (limited injuries).

      • (Score: 1, Informative) by Anonymous Coward on Wednesday February 26 2020, @09:38PM (3 children)

        by Anonymous Coward on Wednesday February 26 2020, @09:38PM (#963119)

        Eh, in the meantime the Tesla badge on a car is a marker to stay the hell away and follow from a distance in another lane.

        Yeah, the whole highway to yourself. I play a similar trick when I talk to myself on the bus so nobody will sit next to me.

        • (Score: 2) by barbara hudson on Wednesday February 26 2020, @09:46PM (2 children)

          by barbara hudson (6443) <barbara.Jane.hudson@icloud.com> on Wednesday February 26 2020, @09:46PM (#963132) Journal

          Eh, in the meantime the Tesla badge on a car is a marker to stay the hell away and follow from a distance in another lane.

          Yeah, the whole highway to yourself. I play a similar trick when I talk to myself on the bus so nobody will sit next to me.

          Doesn't work any more - everyone is "talking to themselves" when using bluetooth on their phones. You need to ask any interlopers if the voices in your head are too loud for them. Or ask them if they want to talk about the WatchTower.

          --
          SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.
          • (Score: 4, Funny) by sjames on Wednesday February 26 2020, @09:59PM (1 child)

            by sjames (2882) on Wednesday February 26 2020, @09:59PM (#963145) Journal

            Just give them a goofy grin and declare "I'm wearing new socks!".

            • (Score: 2) by barbara hudson on Wednesday February 26 2020, @10:37PM

              by barbara hudson (6443) <barbara.Jane.hudson@icloud.com> on Wednesday February 26 2020, @10:37PM (#963186) Journal

              Just give them a goofy grin and declare "I'm wearing new socks!"

              ... for extra effectiveness, say it while wearing sandals and no socks. Then say "you'll never guess WHERE I'm wearing them."

              Might be fun to make videos of people doing stuff to encourage people to get up and move.

              --
              SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.
      • (Score: 1, Informative) by Anonymous Coward on Wednesday February 26 2020, @10:28PM

        by Anonymous Coward on Wednesday February 26 2020, @10:28PM (#963178)
    • (Score: 5, Informative) by Booga1 on Wednesday February 26 2020, @09:37PM (6 children)

      by Booga1 (6333) on Wednesday February 26 2020, @09:37PM (#963118)

      Well, you can rule that out in this situation. From a previous report on this (linked):

      "The crash attenuator was an SCI smart cushion attenuator system, which was previously damaged on March 12, 2018, in a single-vehicle crash involving a 2010 Toyota Prius (see figure 3). "

      That particular spot has had multiple accidents over the years, and Caltrans sometimes takes weeks to repair the barrier. If I remember correctly, it's a left-hand exit. That's something that's pretty rare in the US and it throws people off all the time.
      Locally, I've seen people get angry at solo drivers moving into the carpool lane to take one of these exits. It is not a carpool-restricted exit, and the lines become dashes to allow anyone to use it. It doesn't help. The carpool drivers get road-rage angry to the point they'll tailgate the car in front by inches just to deny the solo driver a chance to take the exit.

      I'm not a fan of autonomous driving, but at least the systems are only stupid instead of malicious.

      • (Score: 2, Informative) by Anonymous Coward on Wednesday February 26 2020, @09:50PM

        by Anonymous Coward on Wednesday February 26 2020, @09:50PM (#963136)

        Another report (can't find it just now) notes that the same Tesla had previously veered left at the same location. The driver had mentioned to friends/family that he had to intervene to keep the car in the lane at that left exit... So the driver already knew that Autopilot had trouble with that road/lane configuration. And yet, wasn't paying attention when he got there.

        20-20 hindsight -- the Tesla could have noted that driver intervention was needed at that location and turned off Autopilot (with suitable warnings) some distance in advance, forcing manual control. But no, the Autopilot did the same thing again.

        In a sense it's a little surprising that more Teslas didn't follow this one into the same "trap" before the next software update.
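The commenter's suggestion can be sketched as a simple geofence check. Everything here is hypothetical (the function names, the approximate coordinates, the 10-second lead time); it illustrates the idea, not any real Autopilot behavior:

```python
# A minimal sketch of the commenter's idea: remember locations where
# the driver previously had to intervene, and force a hand-back to
# manual control (with warnings) when approaching one.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Locations where a prior manual intervention was logged
# (approximate US-101 / SR-85 gore area, for illustration only).
intervention_sites = [(37.4103, -122.0755)]

def should_force_handback(lat, lon, speed_mps, lead_time_s=10.0):
    """Disengage early enough to give the driver lead_time_s to react."""
    lead_m = speed_mps * lead_time_s
    return any(haversine_m(lat, lon, slat, slon) <= lead_m
               for slat, slon in intervention_sites)
```

At about 31 m/s (roughly 70 mph) and a 10-second lead, the hand-back would trigger some 310 m before the trouble spot.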

      • (Score: 4, Interesting) by Snotnose on Thursday February 27 2020, @01:11AM (4 children)

        by Snotnose (1623) on Thursday February 27 2020, @01:11AM (#963256)

        Quite frankly I don't give a shit. The dead asshole was in the driver's seat playing a video game, instead of paying attention. Don't care why the road was damaged there. Don't care that it was a known issue for Tesla's auto-drive. Don't care that Caltrans took their own sweet time fixing something that got broken in an earlier accident.

        Had this asshole been watching the road he would be alive today to tweet about Tesla's bad auto pilot. Instead, it was more important to him to farm corn in farmville, or whatever.

        Fuck him. Better he kills himself running into concrete than killing a family tooling along in their SUV, with the driver actually, ya'know, driving.

        --
        When the dust settled America realized it was saved by a porn star.
        • (Score: 2) by Booga1 on Thursday February 27 2020, @01:34AM (3 children)

          by Booga1 (6333) on Thursday February 27 2020, @01:34AM (#963266)

          The crash may very well have taken others out had things turned out a little differently. This was a multi-car accident.

          From the article:

          The 38-year-old driver of the 2017 Tesla Model X P100D electric-powered sport utility vehicle died from multiple blunt-force injuries after his SUV entered the gore area of the US-101 and State Route 85 exit ramp and struck a damaged and nonoperational crash attenuator at a speed of 70.8 mph. The Tesla was then struck by two other vehicles, resulting in the injury of one other person. The Tesla’s high-voltage battery was breached in the collision and a post-crash fire ensued. Witnesses removed the Tesla driver from the vehicle before it was engulfed in flames.

          • (Score: 4, Informative) by Booga1 on Thursday February 27 2020, @01:38AM (2 children)

            by Booga1 (6333) on Thursday February 27 2020, @01:38AM (#963267)

            The photo from the article [ntsb.gov] really puts it into perspective how close it could have been to a multiple fatality accident.

            • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @04:10PM (1 child)

              by Anonymous Coward on Thursday February 27 2020, @04:10PM (#963557)

              thanks for the link.
              i was just thinking why that concrete block is square and not sloped.
              my guess is that a sloped concrete block (front side to enemy) would FLIP a car beautifully.
              a flipped car doesn't travel very far and will stop from friction. furthermore, once flipped there's still enough "crumple zone" to protect passenger(s).
              anyways, if you want to make the divider even more efficient at killing people, i recommend forming a big honking iron into a vertical cleave ^_^

              • (Score: 2) by Booga1 on Thursday February 27 2020, @04:28PM

                by Booga1 (6333) on Thursday February 27 2020, @04:28PM (#963572)

                Can't pick between a solid block and an angled ramp? I see you're willing to split the difference!

    • (Score: 2) by sjames on Wednesday February 26 2020, @09:57PM

      by sjames (2882) on Wednesday February 26 2020, @09:57PM (#963144) Journal

      Or recognize that sometimes with or without autopilot, people will do dumb things against all warnings and advice. Do we disable all cars or do we give appropriate warnings and hope they don't do something stupid? (such as playing a video game while driving in spite of audible and visual warnings to pay attention to the road)

    • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @10:23PM

      by Anonymous Coward on Wednesday February 26 2020, @10:23PM (#963170)

      You mean like this guy [cnet.com]?

  • (Score: 2) by Gaaark on Wednesday February 26 2020, @09:27PM (2 children)

    by Gaaark (41) on Wednesday February 26 2020, @09:27PM (#963102) Journal

    Take them by the hand, and they'll rip it away and run in front of the bus.

    Stupid is as....

    --
    --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
    • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:33PM (1 child)

      by Anonymous Coward on Wednesday February 26 2020, @09:33PM (#963111)

      Corollary?
      Apple hires stupid people for game development? I think there is more to it than just that.

      For example, it's rare to really internalize risk properly. The highway system may go a really long way (million miles?) between accidents. But if a user of Autopilot (or other assist system) sees that it's doing ok after a few weeks, or a few months, they start to assume it's as good as the system.

      Another possibility is the built-in addictive ability of gaming, it's so good that even a game programmer is sucked in and needs that constant fix.
         

  • (Score: 2, Informative) by Anonymous Coward on Wednesday February 26 2020, @09:27PM

    by Anonymous Coward on Wednesday February 26 2020, @09:27PM (#963103)

    This news page has a photo of the barrier that was previously hit (that driver was more-or-less OK), but the barrier wasn't repaired/reset in time for the Tesla:
        https://ktla.com/news/local-news/ntsb-says-california-officials-failed-to-fix-highway-barrier-before-deadly-tesla-crash-in-bay-area/ [ktla.com]
    Article was posted last fall.

  • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:28PM (2 children)

    by Anonymous Coward on Wednesday February 26 2020, @09:28PM (#963104)

    I thought this was going to be about a Tesla crashing and distracting someone from their videogame, so they lost on the last level or something.

    • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:38PM

      by Anonymous Coward on Wednesday February 26 2020, @09:38PM (#963121)

      Oh, he lost all right. Big time, the whole enchilada.

      Not sure if the phone logs revealed what game level he was on before the accident. Depending on your religious beliefs, he may be on to the next level now.

    • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @09:48PM

      by Anonymous Coward on Wednesday February 26 2020, @09:48PM (#963133)

      My read was “running video game on Tesla’s computer. And crashed the car”

      Would make a great gaming platform. But only in PARK

  • (Score: 1, Insightful) by Anonymous Coward on Wednesday February 26 2020, @09:41PM (15 children)

    by Anonymous Coward on Wednesday February 26 2020, @09:41PM (#963126)

    The driver expects the car to drive itself. The car can't actually drive itself safely, and requires babysitting.
    It's a terrible feature as it stands. You might as well just drive the car yourself if you still have to look over and deal with whatever the hell the car is doing.

    • (Score: 3, Insightful) by sjames on Wednesday February 26 2020, @10:09PM

      by sjames (2882) on Wednesday February 26 2020, @10:09PM (#963156) Journal

      The car constantly bongs and warns him that he needs to pay attention and he ignores it...

    • (Score: 5, Interesting) by KilroySmith on Wednesday February 26 2020, @10:18PM (2 children)

      by KilroySmith (2113) on Wednesday February 26 2020, @10:18PM (#963167)

      Actually, it's an excellent feature if used intelligently. Sadly, Mr. Huang, despite being an engineer, was not using the system intelligently. IMHO, continuing to use AP as he did despite having multiple instances of it working badly indicates to me that he had a blatant disregard for his own safety.

      I have a Model 3, and use AP extensively. I've also noted that it's close to being able to drive autonomously on freeways, but it's not there yet. It occasionally does rude things, occasionally does stupid things, although I haven't noted it doing dangerous things yet - but I read the news and know of the kinds of dangerous things it can do. As a result, I drive with my hand on the wheel and my eyes outside the car. It makes driving less stressful once you realize that it's a driver assist and not a driver replacement; I'm operating at a more executive level of watching what's going on around me rather than having to pay attention to whether I'm centered in the lane, driving an appropriate speed, or following at a safe distance - the car is excellent at those tasks.

      Maybe this situation shows why we can't have nice things, why nothing should be sold unless it's idiot proof. Unfortunately, the options are currently less savory - the September 11, 2001 terror attacks in the USA that forever changed our society killed fewer people than vehicles do in a normal month in the USA. Autonomous driving systems, even flawed ones, can drastically reduce those deaths and injuries.

      • (Score: 2, Insightful) by Anonymous Coward on Thursday February 27 2020, @03:32AM (1 child)

        by Anonymous Coward on Thursday February 27 2020, @03:32AM (#963302)

        Jesus, driving isn't that hard, especially if you have an automatic transmission, which everybody in the US does (all except 5 people).
        You can drive while thinking of other things, even. You are telling me you need an "assist" in the form of the criminally misnamed Auto Pilot to help you drive? You are going senile if that is the case. Look, Tesla's Auto Pilot is the worst possible implementation: you have to pay attention and have hands ready as if you were driving, while not driving, in case on a second's notice you HAVE to drive. That's more cognitive load and context switching than simply driving! At least if you plan on not crashing...

    • (Score: 5, Interesting) by darkfeline on Wednesday February 26 2020, @11:27PM (10 children)

      by darkfeline (1030) on Wednesday February 26 2020, @11:27PM (#963212) Homepage

      It's quite interesting where this misconception about autopilot could have come from.

      Do passengers expect that airplane autopilot means that their human pilot gets to take a nap or play hanky panky? Why then do they expect to be able to do the same with car autopilot?

      Maybe it's because of the marketing, but I don't know anything about that since I've never seen a Tesla ad.

      --
      Join the SDF Public Access UNIX System today!
      • (Score: 0, Insightful) by Anonymous Coward on Thursday February 27 2020, @12:50AM (5 children)

        by Anonymous Coward on Thursday February 27 2020, @12:50AM (#963250)

        Apparently, your English isn't very good. Autopilot literally means selfpilot. The fact that the aviation industry has a different definition doesn't change the meaning of the components of the word. That's what normal people expect out of an autopilot, the ability to drive a segment without intervention.

        • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @07:04AM (2 children)

          by Anonymous Coward on Thursday February 27 2020, @07:04AM (#963350)

          So does that mean automobiles will normally just start moving by themselves? ;)

          Without intervention does not mean you can safely play a video game or go to sleep or leave the pilot seat while it's doing so.

          See also: https://en.wikipedia.org/wiki/Autopilot#First_autopilots [wikipedia.org]

          • (Score: 2) by maxwell demon on Thursday February 27 2020, @07:08PM

            by maxwell demon (1608) on Thursday February 27 2020, @07:08PM (#963663) Journal

            So does that mean automobiles will normally just start moving by themselves? ;)

            Automobiles move themselves (before, you had to have a horse doing the moving). There is no "starting" in "automobile" (sorry, I'm too lazy to find out what "starting" is in Latin).

            --
            The Tao of math: The numbers you can count are not the real numbers.
          • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @08:34PM

            by Anonymous Coward on Thursday February 27 2020, @08:34PM (#963725)

            As the other poster said, automobiles move themselves once started and put into gear. The steering and specific speed are controlled by the driver. Auto means self, mobile means move, so they are self moving.

            If you have to count on overriding the common use of a word in reference to a mass market product, bad things are likely to happen. It's one of the reasons for weird invented terms for features.

        • (Score: 2) by darkfeline on Thursday February 27 2020, @09:52PM

          by darkfeline (1030) on Thursday February 27 2020, @09:52PM (#963795) Homepage

          Autopilot does have the ability to drive a segment without intervention, right into the concrete barrier.

          Human pilots make mistakes too, I don't see why selfpilots are exempt from that.

          --
          Join the SDF Public Access UNIX System today!
        • (Score: 2) by sjames on Friday February 28 2020, @08:07PM

          by sjames (2882) on Friday February 28 2020, @08:07PM (#964307) Journal

          And hot water heater means a device that further heats already hot water but people routinely supply it with cold water because they know what is actually meant.

      • (Score: 2) by maxwell demon on Thursday February 27 2020, @01:43PM (2 children)

        by maxwell demon (1608) on Thursday February 27 2020, @01:43PM (#963463) Journal

        Do passengers expect that airplane autopilot means that their human pilot gets to take a nap or play hanky panky?

        Yes. Well, not exactly, they expect the pilots to still handle things like communications with towers, but they definitely wouldn't expect that the pilot has to care much about what the plane does.

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @05:21PM (1 child)

          by Anonymous Coward on Thursday February 27 2020, @05:21PM (#963618)

          You have the advantage in the air that everyone else is following a track as well, and there aren't any trees, gas stations, or pedestrians up at 40,000 feet. You tend to just hit air, which is kind of soft.

          • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @08:36PM

            by Anonymous Coward on Thursday February 27 2020, @08:36PM (#963727)

            Then, don't call it autopilot until it can self pilot reliably. Problem solved.

      • (Score: 2) by sjames on Friday February 28 2020, @08:22PM

        by sjames (2882) on Friday February 28 2020, @08:22PM (#964311) Journal

        The Tesla autopilot actually does more than a jumbo jet's autopilot does. The aircraft version doesn't include collision avoidance, that's the pilot's job.

  • (Score: 1, Funny) by Anonymous Coward on Wednesday February 26 2020, @09:43PM (2 children)

    by Anonymous Coward on Wednesday February 26 2020, @09:43PM (#963129)

    And Tesla's response... He wasn't holding it right

    • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @10:22PM (1 child)

      by Anonymous Coward on Wednesday February 26 2020, @10:22PM (#963168)

      Has it gotten to the point where "holding it wrong" is more important for the phone than the dick?

      • (Score: 0) by Anonymous Coward on Wednesday February 26 2020, @11:34PM

        by Anonymous Coward on Wednesday February 26 2020, @11:34PM (#963215)

        I would guess iPhone users stroke both the same way.

  • (Score: 1) by bobmorning on Wednesday February 26 2020, @10:53PM (2 children)

    by bobmorning (6045) on Wednesday February 26 2020, @10:53PM (#963196)

    The gene pool just got a slight improvement.

    • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @02:35AM

      by Anonymous Coward on Thursday February 27 2020, @02:35AM (#963285)

      That is some dumb shit, weren't you defending Mike the Rocket Man?

    • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @03:41AM

      by Anonymous Coward on Thursday February 27 2020, @03:41AM (#963305)

      But the automotive technology gene pool didn't get any better. This shit is faulty and does not deliver as promised, but people will keep buying it. A crash will not stop it from reproducing.

  • (Score: 0) by Anonymous Coward on Thursday February 27 2020, @01:27AM

    by Anonymous Coward on Thursday February 27 2020, @01:27AM (#963261)

    is that it didn't make a normal crash sound but that pacman sound when you lose a life.

  • (Score: 2) by maxwell demon on Thursday February 27 2020, @01:46PM

    by maxwell demon (1608) on Thursday February 27 2020, @01:46PM (#963464) Journal

    An autopilot should never get distracted by a video game! :-)

    --
    The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 2) by Phoenix666 on Thursday February 27 2020, @09:52PM (1 child)

    by Phoenix666 (552) on Thursday February 27 2020, @09:52PM (#963791) Journal

    Wait, was he playing Frogger? Because that would be choice.

    --
    Washington DC delenda est.
    • (Score: 2) by sjames on Friday February 28 2020, @08:25PM

      by sjames (2882) on Friday February 28 2020, @08:25PM (#964312) Journal

      I have often thought that the audible crosswalk signals should use the Frogger SPLAT sound to signal don't walk.
