posted by janrinok on Monday April 02 2018, @01:27PM
from the I'll-wait-until-the-bugs-are-ironed-out dept.

Tesla Model X driver dies in Mountain View crash

Submitted via IRC for Fnord666

The driver of a Tesla Model X has died following a highway crash in Mountain View, leaving a number of safety questions.

Source: https://www.engadget.com/2018/03/24/tesla-model-x-driver-dies-in-mountain-view-crash/

Tesla Crash: Model X Was In Autopilot Mode, Firm Says

In a post on its website, the electric-car maker said computer logs retrieved from the wrecked SUV show that Tesla's driver-assisting Autopilot technology was engaged and that the driver doesn't appear to have grabbed the steering wheel in the seconds before the crash.

The car's 38-year-old driver died after the vehicle hit a concrete lane divider on a Northern California freeway and caught fire. The accident happened March 23.

[...] In its Friday post, Tesla said the crashed Model X's computer logs show that the driver's hands weren't detected on the steering wheel for 6 seconds prior to the accident. It said they also show the driver had "about five seconds and 150 meters of unobstructed view of the concrete divider" before the crash but that "no action was taken."

The company cited various statistics in defending Autopilot in the post and said there's no doubt the technology makes vehicles safer than traditional cars.

"Over a year ago," the post said, "our first iteration of Autopilot was found by the US government to reduce crash rates by as much as 40 percent. Internal data confirms that recent updates to Autopilot have improved system reliability."

"Tesla Autopilot does not prevent all accidents -- such a standard would be impossible -- but it makes them much less likely to occur," the post reads. "It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists."


Original Submission #1 | Original Submission #2

 
  • (Score: 4, Informative) by LoRdTAW on Monday April 02 2018, @01:58PM (6 children)

    by LoRdTAW (3755) on Monday April 02 2018, @01:58PM (#661452) Journal

    And this genius provides a demonstration:
    https://electrek.co/2018/04/02/tesla-fatal-autopilot-crash-recreation/ [electrek.co]

    • (Score: 1, Interesting) by Anonymous Coward on Monday April 02 2018, @02:35PM (2 children)

      by Anonymous Coward on Monday April 02 2018, @02:35PM (#661480)

      Thanks for that electrek link; it's about what I expected given the clues in various other reports. The Tesla post mentions the crushed crash attenuator, apparently crushed before this accident(??). In that case, will the highway dept share liability for not fixing it promptly? The attenuator might be a stack of the common yellow barrels (the Fitch barrier, invented by WWII fighter pilot and race car driver John Fitch) or some other design. The Fitch design is clever: the ballast (sand or water) is held up off the ground, so that the center of mass of the barrier is at a similar height to the CG of a car.

      Is this scenario similar to the Tesla that ran into the stopped emergency vehicle not long ago? Something stopped in the lane seems to be filtered out or ignored by the video system.

      • (Score: 2, Informative) by tftp on Monday April 02 2018, @09:12PM (1 child)

        by tftp (806) on Monday April 02 2018, @09:12PM (#661664) Homepage

        No, in that case it was a different bug. Autopilot was following a car, but then the leading car left the lane (since its driver saw the fire truck ahead). The autopilot detected that there was no car ahead anymore and accelerated the Tesla to the preset speed, at which it proceeded to hit the fire truck.
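
        A minimal sketch of the follow-then-resume logic described above (illustrative Python, not Tesla's actual code):

          def target_speed(lead_detected, lead_speed, preset):
              # Adaptive cruise: match a slower leader, else resume the preset.
              if lead_detected:
                  return min(preset, lead_speed)
              # No leader tracked anymore: accelerate back to the preset speed,
              # even if a stationary obstacle (the fire truck) sits in the lane
              # unseen by the tracker.
              return preset

          print(target_speed(True, 25, preset=65))    # 25: following the lead car
          print(target_speed(False, None, preset=65)) # 65: leader gone, speed up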

        The warning about this behavior was prominently printed on page 358 of the manual, second paragraph from the bottom. The driver is, of course, expected to know all the listed bugs and be ready to counter them at any time.

        If you ask me, I'd rather drive the car myself; it is less tiring and much safer. I can't imagine what it's like to ride in a seemingly friendly car that loves to kill you as soon as it sees an opportunity.

        • (Score: 2) by LoRdTAW on Monday April 02 2018, @11:43PM

          by LoRdTAW (3755) on Monday April 02 2018, @11:43PM (#661722) Journal

          I wonder if there is a way to permanently disable it in case one of those bugs includes self-awareness, or at the very least, malfunction.

    • (Score: 3, Insightful) by Nuke on Monday April 02 2018, @10:10PM (2 children)

      by Nuke (3162) on Monday April 02 2018, @10:10PM (#661693)

      The Electrek video tells me all I need to know about SD cars at the present time. If they can be fooled by so elementary a situation, then they have a long, long way to go. If they can fuck up on a wide, open, well-lit, signed, and marked bit of road like that, I hate to think how they would manage on the roads around me in a rural part of the UK, for example.

      I wonder when the shills will stop claiming they are safer than the average human driver, unless the drivers where they live are very bad indeed. Nevertheless, the guys seen driving ahead in that video managed to pass that point without killing themselves. Whatever the crash statistics of SD cars are, they are not as good as mine, because they have crashed and I have not. Small sample, but SD cars are so far themselves a small sample, and mostly with test drivers aboard being more alert than the "average" driver would be. Wait until the latter start using them. Cases like this show what happens when the driver is not ready to intervene at all times: he needs to be just as keyed up and constantly making decisions as if he were driving himself anyway, leaving us wondering what the point is.

      • (Score: 3, Interesting) by LoRdTAW on Monday April 02 2018, @11:55PM

        by LoRdTAW (3755) on Monday April 02 2018, @11:55PM (#661724) Journal

        I personally believe the so-called shills for autonomous cars are people who have no real understanding of computers, or of how incredibly complex these problems are to solve, but pretend to understand. Sure, we have computer vision and there are cool demos, but imagine all of those demos running at once, with algorithms deciding what's a person, the road, a sign, and a near-infinite number of other objects and patterns.

        Putting these barely tested time bombs on the road is another great demonstration of man's hubris with regard to technological advancement. Time to admit we DO NOT have safe autonomous vehicles on the road. We have a lot more work to do to prove otherwise.

      • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @08:02AM

        by Anonymous Coward on Tuesday April 03 2018, @08:02AM (#661856)

        and mostly with test drivers aboard being more alert than the "average" driver would be

        Even the Uber test driver in the car that killed a pedestrian recently looked up from the phone about as often as the average cell-phone-using driver in a *non self driving* car.

        Just wait until they start putting people who are used to texting and driving into self-driving cars... They won't look up in time to realize they are about to hit someone until it's too late to brake, like in that Uber video. They will be looking in the mirror thinking "what was that bump?"

  • (Score: 2) by takyon on Monday April 02 2018, @02:18PM (16 children)

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday April 02 2018, @02:18PM (#661464) Journal

    Investors not willing to put up with Tesla's shit for much longer?

    https://www.thestreet.com/story/14540154/1/tesla-now-looks-like-a-show-me-story.html [thestreet.com]

    Keep in mind (before someone labels this part of a womyn-born womyn conspiracy to destroy Tesla/SpaceX) that the Musky one himself has admitted [arstechnica.com] that "At the beginning I thought Tesla and SpaceX maybe had a 10 percent chance at success". It's just that the company is still in a vulnerable pupal state, with low production, recalls, mounting electric car competition, and other issues.

    I don't think these crash incidents matter too much, though. "Autopilot" isn't "fully autonomous", and other car manufacturers have similar autonomy features (keeping in lane, not rear-ending the car in front, etc.). Other cars are crashing without making national news.

    A Tesla update could make Autopilot a little safer: https://www.engadget.com/2018/04/02/tesla-model-3-autopilot-steering-wheel/ [engadget.com]

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 1, Insightful) by Anonymous Coward on Monday April 02 2018, @02:23PM (1 child)

      by Anonymous Coward on Monday April 02 2018, @02:23PM (#661470)

      Look both ways before crossing; the laws of man do not trump the laws of physics.

      • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @08:52AM

        by Anonymous Coward on Tuesday April 03 2018, @08:52AM (#661867)

        So now we need self driving concrete barriers that can look both ways to avoid getting hit by self driving cars... Did you even read the summary, or do you think this is still the Uber story?

    • (Score: 1) by khallow on Monday April 02 2018, @02:24PM

      by khallow (3766) Subscriber Badge on Monday April 02 2018, @02:24PM (#661472) Journal

      Investors not willing to put up with Tesla's shit for much longer?

      Too late for that, IMHO, but they're welcome to sell at a loss (though probably less of a loss than if they hold on). The first video mentions bondholders. Those guys will be needed to buy more bonds and fuel further growth; those are the people Tesla needs to worry about.

    • (Score: 3, Insightful) by VLM on Monday April 02 2018, @02:28PM (9 children)

      by VLM (445) on Monday April 02 2018, @02:28PM (#661475)

      mounting electric car competition

      It's interesting that if they dropped the self-driving gimmick they would be a more successful car company.

      Oldest startup rule in the book: never try to "innovate" too many things at the same time. So naturally they do "self driving" and "electric car" at once, and then crash (pun?). Just one or the other probably would have worked out.

      • (Score: 2) by VLM on Monday April 02 2018, @02:31PM (8 children)

        by VLM (445) on Monday April 02 2018, @02:31PM (#661476)

        Oh, and as a followup, VLM's law of shitty journalism (but I repeat myself, LOL) says that if the company fails due to EV issues or self-driving issues, shitty journalists will use it as proof that the other side, the one that actually worked, is unworkable in the marketplace.

        So if self-driving crashes kill the company, as now seems likely, there will be shitty journalists pushing the meme that EV is a dead end. And if EV battery problems kill the company, as seems much less likely, there will be shitty journalists pushing the meme that self-driving crashes kill companies.

        • (Score: -1, Offtopic) by Anonymous Coward on Monday April 02 2018, @02:38PM (1 child)

          by Anonymous Coward on Monday April 02 2018, @02:38PM (#661482)

          If this junk website offered editing, like was discussed earlier, then maybe we could avoid this nonsensical self-replying.

          • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @06:25AM

            by Anonymous Coward on Tuesday April 03 2018, @06:25AM (#661837)

            I for one like the way it works now.

            Sure, I make many a stupid typo that I would like to fix, but if you could edit your post, you could change much more than a minor typo, making the debate and reasoning very difficult to follow, with people saying they agree with somebody (A) when the edited post now talks about something else (B). This also makes people more likely to pay attention to what they post in order to avoid stupid typos, and I'm sure many will cancel posting worthless drivel because of this double take. While replying to yourself might be considered inelegant, I think it's a minor price to pay here.

        • (Score: 4, Informative) by takyon on Monday April 02 2018, @02:48PM (1 child)

          by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday April 02 2018, @02:48PM (#661490) Journal

          there will be shitty journalists pushing the meme that EV is a dead end.

          "EV is a dead end" is a dead meme.

          Volvo Aiming to Sell One Million Electric Vehicles by 2025 [soylentnews.org]
          Ford to Invest $11 Billion in Electric Vehicles and Produce 40 Hybrid and Electric Models by 2022 [soylentnews.org]

          You can probably find more.

          And here's a fun one:

          U.S. Utilities Look To Electric Cars As Their Savior Amid Decline In Demand [npr.org]

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by HiThere on Monday April 02 2018, @05:40PM

            by HiThere (866) Subscriber Badge on Monday April 02 2018, @05:40PM (#661583) Journal

            It's not a dead end... unless they fail to solve the battery supply problem. I've seen a couple of pieces about new processes for extracting lithium, and a couple about new technologies based on carbon, but they need to come up with *SOME* answer or EVs are going to be quite limited in number. And processes still in the lab won't do the job. (But they're clearly working on the problem.)

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
        • (Score: -1, Redundant) by Anonymous Coward on Monday April 02 2018, @02:49PM (1 child)

          by Anonymous Coward on Monday April 02 2018, @02:49PM (#661492)

          If this junk website offered editing, like was discussed earlier, then maybe we could avoid this nonsensical self-replying.

          • (Score: 0) by Anonymous Coward on Monday April 02 2018, @02:51PM

            by Anonymous Coward on Monday April 02 2018, @02:51PM (#661497)

            You are a junk user. Too bad we can't delete your posts.

        • (Score: -1, Redundant) by Anonymous Coward on Monday April 02 2018, @05:03PM

          by Anonymous Coward on Monday April 02 2018, @05:03PM (#661577)

          If this junk website offered editing, like was discussed earlier, then maybe we could avoid this nonsensical self-replying.

        • (Score: -1, Redundant) by Anonymous Coward on Monday April 02 2018, @05:49PM

          by Anonymous Coward on Monday April 02 2018, @05:49PM (#661586)

          If this junk website offered editing, like was discussed earlier, then maybe we could avoid this nonsensical self-replying.

    • (Score: 2) by Weasley on Monday April 02 2018, @02:43PM (2 children)

      by Weasley (6421) on Monday April 02 2018, @02:43PM (#661485)

      The notable thing about Teslas is not that they're autonomous cars; it's that they're electric sports cars. The autopilot is just garnish.

      And is that website a joke? It looks like they shot that video with a handheld camcorder (maybe even a cell phone) using the on-camera mic, with amateurish financial commentary delivered in the urgent manner that's popular right now.

  • (Score: 4, Insightful) by VLM on Monday April 02 2018, @02:36PM (4 children)

    by VLM (445) on Monday April 02 2018, @02:36PM (#661481)

    It said they also show the driver had "about five seconds and 150 meters of unobstructed view of the concrete divider" before the crash but that "no action was taken."

    Lack of cooperation. Pilots and copilots have developed, over a century and over hundreds if not thousands of deaths, a fairly robust protocol to exchange control of a large aircraft.

    With self-driving cars it's a million times more lame; there appears to be nothing more to the protocol than "the pilot wasn't touching the controls at the time of the copilot's CFIT, so we'll close out the investigation with 100% blame on the pilot". I'm not sure this is a fixable "feature" of self-driving cars; it's not going to be possible to trade control without killing people. Either we'll have cars with no controls other than voice destination selection, or manual cars; nothing in between can work without lots of deaths.
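
    For contrast, aviation's "positive exchange of controls" is essentially an explicit handshake: control moves only after an offer and an acknowledgment, so there is never a moment when each party believes the other one is flying. A minimal sketch of that idea (hypothetical Python class, not any real avionics or vehicle API):

      class ControlHandoff:
          def __init__(self):
              self.holder = "autopilot"   # who currently has control
              self.pending = None         # offer awaiting acknowledgment

          def offer(self, to):
              # Current holder offers control: "You have the controls."
              self.pending = to

          def acknowledge(self, who):
              # Control moves only on explicit acceptance: "I have the controls."
              if self.pending != who:
                  raise RuntimeError("no pending offer for %s; control stays with %s"
                                     % (who, self.holder))
              self.holder = who
              self.pending = None

      handoff = ControlHandoff()
      handoff.offer("driver")        # autopilot: "You have the controls"
      handoff.acknowledge("driver")  # driver: "I have the controls"
      assert handoff.holder == "driver"

    Autopilot's handoff, by contrast, has no explicit step at all: just a hands-on-wheel timeout and a log entry to assign blame afterwards.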

    • (Score: -1, Flamebait) by Anonymous Coward on Monday April 02 2018, @02:42PM (1 child)

      by Anonymous Coward on Monday April 02 2018, @02:42PM (#661484)

      That would still be a win. In fact, that's what Tesla has been saying, much to the horror of the Leftist media, which has proven again and again to be incapable of forming logical deductions, especially with regard to statistics.

      • (Score: 2) by tonyPick on Tuesday April 03 2018, @08:48AM

        by tonyPick (1237) on Tuesday April 03 2018, @08:48AM (#661865) Homepage Journal

        So good you posted it three times?

        Anyhow: Tesla is being deliberately misleading with their statements here. Their comparison is to "all accidents over all groups of drivers in all conditions", when they are in fact covering a relatively low-accident subset of the driving population (luxury car owners) and only handling the safest driving conditions (highway driving, clear conditions).

        One of the Ars commenters ran the numbers, and Tesla safety is worse, even against a baseline of Luxury Vehicles:
        https://arstechnica.com/cars/2018/03/tesla-says-autopilot-was-active-during-fatal-crash-in-mountain-view/?comments=1&post=35079915 [arstechnica.com]

        TL;DR:
        Tesla claims "one fatality every 320 million miles" (which doesn't count crashes where the records aren't recoverable, so it's a best case).
        Being very charitable to Tesla when looking at the stats, the worst of the comparable cars was the Lexus ES330, with one fatality every 417 million miles.

        And that's being very generous to Tesla, because it doesn't account for the fact that Tesla AP isn't functional in any of the conditions that make accidents more likely (snow, heavy rain, or fog), and the Lexus figure is for 2002-era safety systems, so current conventional cars will do even better.
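
        A quick back-of-the-envelope check of those figures in Python (numbers as quoted above, not independently verified):

          tesla_miles_per_fatality = 320e6   # Tesla's own Autopilot claim (best case)
          lexus_miles_per_fatality = 417e6   # worst comparable luxury car (ES330)

          # Convert to fatalities per 100 million miles.
          tesla_rate = 1e8 / tesla_miles_per_fatality   # ~0.31
          lexus_rate = 1e8 / lexus_miles_per_fatality   # ~0.24

          # Tesla's best-case rate is ~30% worse than the worst comparable car,
          # before adjusting for Autopilot only running in easy conditions.
          print(tesla_rate / lexus_rate)   # ~1.30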

    • (Score: -1, Redundant) by Anonymous Coward on Monday April 02 2018, @05:08PM

      by Anonymous Coward on Monday April 02 2018, @05:08PM (#661579)

      That would still be a win. In fact, that's what Tesla has been saying, much to the horror of the Leftist media, which has proven again and again to be incapable of forming logical deductions, especially with regard to statistics.

    • (Score: 0) by Anonymous Coward on Monday April 02 2018, @05:54PM

      by Anonymous Coward on Monday April 02 2018, @05:54PM (#661588)

      That would still be a win. In fact, that's what Tesla has been saying, much to the horror of the Leftist media, which has proven again and again to be incapable of forming logical deductions, especially with regard to statistics.

  • (Score: 3, Interesting) by VLM on Monday April 02 2018, @02:48PM (12 children)

    by VLM (445) on Monday April 02 2018, @02:48PM (#661489)

    38-year-old driver died

    The reason people drive drunk regardless of the level of draconian punishment is partially alcoholism, but mostly that the average drunk driver has less than a 1 in 10,000 chance of getting picked up on any given night; the odds of getting home uncaught are extremely high.

    Superficially you'd predict the victims of draconian punishment would trend toward the young and stupid, partially because they'd get weeded out early, and partially because being young and stupid makes you more likely to do stupid things. However, drunk drivers are mostly a cross-section of humanity, because the odds of getting caught are so low that being young and stupid isn't a significant handicap WRT avoiding capture.

    An analogy from Africa: 1% of wildebeests get eaten by lions per year, and 1% of wildebeests are sick or hurt. Because a lazy lion goes for the weak, the odds of a lion's kill being a sick wildebeest are near 100%. But poachers pick off a random 0.001% of wildebeests with a scoped rifle, so the odds of a poached wildebeest being diseased match the population distribution, a mere 1%, not the near-100% of lion-captured prey.

    So my point is: in an orderly system, by 38 most of the stupid has been filtered out, so if this were truly a rare problem caused by driver error, the victim would much more likely be young and stupid, like 18. This victim being 38 implies that the total number of failures is very high but the system only gets caught out in a fatal crash very rarely. So the self-driving system is likely failing 10x per day, but most failures happen on boring straightaways in light traffic and nobody notices. The equivalent of /var/log/syslog must be fascinating to read for a self-driving car.

    In summary, the age implies this is not a driver-error situation, despite the corporate press release claim.
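
    The lion/poacher point is Bayes' rule: a selective cause of death concentrates on the vulnerable, while a random cause samples the whole population. A rough numerical sketch, using the illustrative percentages above rather than real data:

      p_sick = 0.01   # 1% of the herd is sick or hurt

      # P(killed | health), per the analogy: lions select for weakness,
      # poachers shoot at random. All values illustrative.
      p_killed = {"lion":    {"sick": 0.99, "healthy": 0.0001},
                  "poacher": {"sick": 1e-5, "healthy": 1e-5}}

      for cause, p in p_killed.items():
          joint_sick = p["sick"] * p_sick
          joint_healthy = p["healthy"] * (1 - p_sick)
          # Bayes: P(sick | killed by this cause)
          print(cause, joint_sick / (joint_sick + joint_healthy))

      # lion: ~0.99 -- kills concentrate on the sick.
      # poacher: 0.01 -- kills mirror the herd. A 38-year-old Autopilot victim
      # fits the random-sampling case: system failure, not driver stupidity.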

    • (Score: 2, Interesting) by Anonymous Coward on Monday April 02 2018, @03:07PM (6 children)

      by Anonymous Coward on Monday April 02 2018, @03:07PM (#661504)

      Maybe people have a habit of identifying the wrong problems and then ramping up the hysteria when their "solutions" don't change much.

      The problem with drunk driving is not that a person is drunk; rather, it's that a large number of people can't stay awake under the influence of even a little alcohol.

      That's why so many of these crashes involve a drunk driver crossing a median, or ramming into something at an incredible speed. They are asleep.

      Incidentally, that's also why "driving under the influence of sleep deprivation" is reported every now and then as being just as bad as driving under the influence of alcohol—if the police only had a test to determine whether you had adequate sleep, they'd use it! Cue the founding of MADD: Mothers Against Dreamy Driving.

      Government statistics show that only about 1/3 of traffic-related deaths are alcohol-related—whatever that means; maybe the pilot was the sober "designated driver", ferrying his drunk buddies home when they crashed? Possibly police found a spent beer bottle in the car from last week's tailgating party? Perhaps the driver had one Miller Lite during a heavy dinner?

      Even if alcohol were eliminated from the world, that would still leave 2/3 of traffic-related deaths ongoing. Yet, you'd think from the rhetoric that it's the sole cause of problems on the road.

      • (Score: 2) by tangomargarine on Monday April 02 2018, @03:15PM (5 children)

        by tangomargarine (667) on Monday April 02 2018, @03:15PM (#661507)

        The problem with drunk driving is not that a person is drunk; rather, it's that a large number of people can't stay awake under the influence of even a little alcohol.

        Well, alcohol also slows down your reactions. So it's not just that a person is drunk.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 0) by Anonymous Coward on Monday April 02 2018, @05:06PM (4 children)

          by Anonymous Coward on Monday April 02 2018, @05:06PM (#661578)

          If you're driving at safe distances, and at the posted speed limit, etc., then you don't need very fast reaction times.

          That's why old people can still drive, and why measurably dumb people are allowed to drive, etc.

          Come on. Surely, you've had a few beers—which is considered very illegal if you then go driving. You KNOW it's not as dangerous as it's made out to be. You KNOW it.

          • (Score: 1) by Sulla on Monday April 02 2018, @06:39PM

            by Sulla (5173) on Monday April 02 2018, @06:39PM (#661612) Journal

            I don't know about you, but I won't even consider driving if there are any external factors affecting my cognition, reaction time, or vision.

            --
            Ceterum censeo Sinae esse delendam
          • (Score: 2) by tangomargarine on Monday April 02 2018, @06:44PM (2 children)

            by tangomargarine (667) on Monday April 02 2018, @06:44PM (#661615)

            Come on. Surely, you've had a few beers—which is considered very illegal if you then go driving.

            According to these [charlotteagenda.com] people who experimented, you can drink 2 or 3 craft beers and still be slightly under the limit (of course that's rather dangerous since breathalyzers are notoriously inaccurate). Which is like the equivalent of 6 Miller Lites?

            You KNOW it's not as dangerous as it's made out to be. You KNOW it.

            I'm not making any claims about how dangerous it is. I'm just saying the tiredness isn't the only part of the equation.

            --
            "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
            • (Score: -1, Redundant) by Anonymous Coward on Monday April 02 2018, @08:27PM (1 child)

              by Anonymous Coward on Monday April 02 2018, @08:27PM (#661654)

              If you're over the limit, then it's DUI by definition.

              However, the State can choose to press DUI charges for any amount.

              • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @06:29AM

                by Anonymous Coward on Tuesday April 03 2018, @06:29AM (#661839)

                [citation needed]

    • (Score: 2) by tangomargarine on Monday April 02 2018, @03:13PM (1 child)

      by tangomargarine (667) on Monday April 02 2018, @03:13PM (#661505)

      So my point is: in an orderly system, by 38 most of the stupid has been filtered out, so if this were truly a rare problem caused by driver error, the victim would much more likely be young and stupid, like 18. This victim being 38 implies that the total number of failures is very high but the system only gets caught out in a fatal crash very rarely. So the self-driving system is likely failing 10x per day, but most failures happen on boring straightaways in light traffic and nobody notices. The equivalent of /var/log/syslog must be fascinating to read for a self-driving car.

      In summary, the age implies this is not a driver-error situation, despite the corporate press release claim.

      I think you're wildly leaping to conclusions here. A single driver over 18 dying means there must be 10 failures every day? WTF no.

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: -1, Flamebait) by Anonymous Coward on Monday April 02 2018, @04:46PM

        by Anonymous Coward on Monday April 02 2018, @04:46PM (#661567)

        Stupid has no age limits.

    • (Score: 2, Funny) by Anonymous Coward on Monday April 02 2018, @03:58PM (2 children)

      by Anonymous Coward on Monday April 02 2018, @03:58PM (#661535)

      Other news sites have car analogies...

      On SN we have wildebeest analogies!

      • (Score: 2) by bob_super on Monday April 02 2018, @07:10PM (1 child)

        by bob_super (1357) on Monday April 02 2018, @07:10PM (#661626)

        Could have gone with Tauntaun, at least...

        • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @06:31AM

          by Anonymous Coward on Tuesday April 03 2018, @06:31AM (#661840)

          1) It's a brave GNU world
          2) Have you smelled the insides of those things?!

  • (Score: 4, Interesting) by ilsa on Monday April 02 2018, @05:00PM (6 children)

    by ilsa (6082) Subscriber Badge on Monday April 02 2018, @05:00PM (#661575)

    I think these problems all stem from a badly chosen name.

    People see 'autopilot' and think they're suddenly driving KITT and no longer need to pay attention or take responsibility for their driving. The technology just isn't there yet. Period.

    Tesla needs to bite the bullet and accept that people are too damned lazy to read the instruction manual for such a critical feature and understand the limitations and requirements. That means changing the name to something less Sci-Fi-y and more accurate to the features that it provides.

    Something like "Driver Assist" is a less sexy sounding name, but would do a much better job of setting expectations.

    • (Score: 2) by bob_super on Monday April 02 2018, @07:07PM

      by bob_super (1357) on Monday April 02 2018, @07:07PM (#661625)

      In the case of the video in the first post, "Lane following" would be the exact term to use.

    • (Score: 2) by darkfeline on Tuesday April 03 2018, @03:06AM (1 child)

      by darkfeline (1030) on Tuesday April 03 2018, @03:06AM (#661783) Homepage

      The name is fine. Do people think airplane autopilot means the human pilot is going to take a nap? Fuck no.

      "Autopilot" is used entirely correctly here. You can't fix stupid. Someone is going to turn on the feature, see how well it works for 15 seconds, and just assume it's perfect.

      --
      Join the SDF Public Access UNIX System today!
      • (Score: 1, Insightful) by Anonymous Coward on Tuesday April 03 2018, @04:59AM

        by Anonymous Coward on Tuesday April 03 2018, @04:59AM (#661820)

        You're wrong. An autopilot can function without human intervention for 95% or more of most flights, and it would be pretty safe for the human pilots to take a one-hour nap, even if they don't do that for obvious reasons. A Tesla "Autopilot" can't do those things. Or maybe they should set it up to just drive as the crow flies, in a straight line to the destination no matter what the obstacles. That would be like a "real autopilot".

    • (Score: 3, Funny) by c0lo on Tuesday April 03 2018, @08:09AM

      by c0lo (156) Subscriber Badge on Tuesday April 03 2018, @08:09AM (#661858) Journal

      Something like "Driver Assist" is a less sexy sounding name, but would do a much better job of setting expectations.

      'Driver Assist' can be made a lot sexier. Literally, I mean. For a price, of course.
      Just sayin'.

      (grin)

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 0) by Anonymous Coward on Tuesday April 03 2018, @08:21AM

      by Anonymous Coward on Tuesday April 03 2018, @08:21AM (#661859)

      While I agree that calling it "Autopilot" makes people believe that it can do all those things that autopilots can do in movies, that's only half of the problem.

      The other part is that the human brain is entirely unfit for tirelessly monitoring a situation, ready to take over at a split second's notice. Humans get bored, start playing with their phones, daydream, fall asleep, etc. They got the whole thing backwards: computers are good at monitoring and taking over at a split second's notice, so it should be the computer that stands ready to take over if the driver is about to kill someone, expanding slowly from that side until they've got everything sorted out. NOT letting the computer do the driving and requiring the person to take over once he's fallen asleep.
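
      A minimal sketch of that inversion (hypothetical interface, not any real vendor's API): the human drives, and the computer only vetoes the inputs when a crash is imminent.

        def guardian_filter(steering, throttle, time_to_collision):
            # Pass the human's inputs through unless a collision is imminent.
            if time_to_collision < 1.5:       # seconds; illustrative threshold
                return 0.0, -1.0              # straighten the wheel, brake hard
            return steering, throttle

        print(guardian_filter(0.2, 0.8, time_to_collision=0.9))   # (0.0, -1.0): override
        print(guardian_filter(0.2, 0.8, time_to_collision=30.0))  # (0.2, 0.8): hands off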

    • (Score: 3, Interesting) by tonyPick on Tuesday April 03 2018, @08:28AM

      by tonyPick (1237) on Tuesday April 03 2018, @08:28AM (#661860) Homepage Journal

      I think these problems all stem from a -badly chosen name- deliberately misleading name, selected by Tesla's Marketing department to make the car sound better than it is, and sell more units.

      FTFY.

  • (Score: 1, Funny) by Anonymous Coward on Monday April 02 2018, @09:34PM (1 child)

    by Anonymous Coward on Monday April 02 2018, @09:34PM (#661672)

    I thought we were talking about their stock value.
