posted by martyb on Friday July 10 2020, @07:39AM
from the blizzards-and-downpours-and-lightning-oh-my! dept.

Tesla 'very close' to full self-driving, Musk says:

Tesla will be able to make its vehicles completely autonomous by the end of this year, founder Elon Musk has said.

It was already "very close" to achieving the basic requirements of this "level-five" autonomy, which requires no driver input, he said.

Tesla's current, level-two Autopilot requires the driver to remain alert and ready to act, with hands on the wheel.

But a future software update could activate level-five autonomy in the cars - with no new hardware, he said.

Regulatory hurdles could block implementation even if the remaining technical hurdles are overcome.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 4, Insightful) by Anonymous Coward on Friday July 10 2020, @08:02AM (4 children)

    by Anonymous Coward on Friday July 10 2020, @08:02AM (#1018985)

    Right into a stopped police car.

    • (Score: 2) by c0lo on Friday July 10 2020, @01:55PM (3 children)

      by c0lo (156) Subscriber Badge on Friday July 10 2020, @01:55PM (#1019059) Journal

      Which makes them quite dangerous when "very close" (grin)

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by Freeman on Friday July 10 2020, @02:52PM (2 children)

        by Freeman (732) on Friday July 10 2020, @02:52PM (#1019085) Journal

        All objects of significant enough size+mass moving at a significant enough speed are dangerous when "very close".

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
        • (Score: 1, Funny) by Anonymous Coward on Friday July 10 2020, @05:22PM

          by Anonymous Coward on Friday July 10 2020, @05:22PM (#1019164)

          There's no need to make yo momma jokes so complicated. Just look at this post.

        • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @01:40PM

          by Anonymous Coward on Saturday July 11 2020, @01:40PM (#1019489)

          Well that's what she said.

  • (Score: 2) by fraxinus-tree on Friday July 10 2020, @08:23AM (6 children)

    by fraxinus-tree (5590) on Friday July 10 2020, @08:23AM (#1018989)

    The good thing is that a driving license will soon no longer be considered a necessity, so a lot of substandard drivers could be deprived of theirs without making them substandard citizens.

    OTOH, self-driving will gradually be made mandatory before it reaches the abilities of the average driver.

    • (Score: 3, Insightful) by DannyB on Friday July 10 2020, @02:23PM (5 children)

      by DannyB (5839) Subscriber Badge on Friday July 10 2020, @02:23PM (#1019068) Journal

      This is actually a global problem. Calling for a global solution. Self driving cars would gradually appear globally. Thus driver licenses could globally disappear -- requiring a new mandatory global identification system. For your safety!

      Also for your safety, suppose the car knows the identity of every person in the vehicle. There would be no need for the police state to pull vehicles over in order to determine who is in the vehicle and whether they have any warrants. There would be no reason to pull over self-driven vehicles, with all occupants' identities known, other than for the amusement of the police.

      If the benevolent older male sibling ("big brother") could constantly know the whereabouts of every single person on Earth, imagine how glorious it would be. It would enable queries such as: "Computer, locate Wesley Crusher."

      No more high speed chases. Vehicles would pull over for police at their slightest whim.

      People would only think good thoughts and nobody would question the benevolence of the state.

      --
      What doesn't kill me makes me weaker for next time.
      • (Score: 1, Insightful) by Anonymous Coward on Friday July 10 2020, @02:33PM (1 child)

        by Anonymous Coward on Friday July 10 2020, @02:33PM (#1019073)

        Yes however the police will be gone by then because BLM.

      • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @04:25AM

        by Anonymous Coward on Saturday July 11 2020, @04:25AM (#1019380)

        Also for your safety, suppose the car knows the identity of every person in the vehicle. There would be no need for the police state to pull vehicles over in order to determine who is in the vehicle and whether they have any warrants. There would be no reason to pull over self-driven vehicles, with all occupants' identities known, other than for the amusement of the police.

        If the benevolent older male sibling ("big brother") could constantly know the whereabouts of every single person on Earth, imagine how glorious it would be. It would enable queries such as: "Computer, locate Wesley Crusher."

        mobile phones already solve this problem for the "benevolent older male sibling"

      • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @01:49PM

        by Anonymous Coward on Saturday July 11 2020, @01:49PM (#1019493)

        It's a miracle that some people can still function in the world. If you need your privacy that badly, it's a fetish. Like killing all the germs or not seeing adverts in every possible place. The world is not here to satisfy your fetish.

      • (Score: 2) by Bot on Sunday July 12 2020, @05:39AM

        by Bot (3902) on Sunday July 12 2020, @05:39AM (#1019740) Journal

        >This is actually a global problem. Calling for a global solution.

        Every time I hear this phrase I reach for my revolver. Unfortunately I don't own one.

        Take a hypothetical dangerous virus. It is a global problem, but the solution can perfectly well be local. In fact, the mayor of a town who simply closed the streets in and out of his town and quarantined/sanitized incoming people and goods dealt with 0 cases. Meanwhile the global solution embodied by the WHO (OMS) CAUSED AVOIDABLE AND INHUMAN DEATHS AND QUICKENED AN ECONOMIC COLLAPSE.

        The phrase is a slogan. The solutions are effective or ineffective, sustainable or unsustainable. Whether they are global or not is relevant only to those wanting to exercise power. If you're not one of them, do not advocate for them. KTHXBYE

        --
        Account abandoned.
  • (Score: 2, Interesting) by Anonymous Coward on Friday July 10 2020, @08:54AM (16 children)

    by Anonymous Coward on Friday July 10 2020, @08:54AM (#1018997)

    Even if one resolves any and all glitches, which is itself probably impossible, there will always be unavoidable circumstances that lead to a bad outcome.

    And what will happen when somebody dies in one of these accidents? People have become knee-jerk reactionaries. The argument that the vehicles are provably safer than humans will be overwhelmed by a tide of "THIS CAR KILLED A BABY! DO YOU LIKE KILLING BABIES? NO? THEN WE MUST BAN THIS CAR! [upvote me, like me, send me money on patreon]" And the more vehicles that are on the road, the more accidents will happen. Imagine we hit peak autonomous driving and there were still 10 fatal accidents per day in the US alone. Well, that would be completely awesome: roughly ten times (about 1000%) safer than humans, since we currently have about 37,000 traffic deaths per year (see the rough arithmetic sketched below). But people would be misled by idiots and by easily manipulated emotions.

    The only places self driving will be able to thrive are those with governments that don't really care about reactionary responses which pretty much eliminates western democracies.
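
    A rough back-of-the-envelope check of those figures, as a Python sketch (it simply assumes the 37,000 deaths/year and hypothetical 10 deaths/day numbers quoted above):

        # Back-of-the-envelope comparison of the figures cited above (a sketch;
        # assumes ~37,000 US road deaths per year and a hypothetical autonomous
        # fleet causing 10 fatal crashes per day, as in the comment).
        human_deaths_per_year = 37_000
        human_deaths_per_day = human_deaths_per_year / 365       # ~101 per day
        autonomous_deaths_per_day = 10                            # hypothetical figure

        improvement = human_deaths_per_day / autonomous_deaths_per_day
        print(f"Human baseline: ~{human_deaths_per_day:.0f} deaths/day")
        print(f"Hypothetical autonomous fleet: {autonomous_deaths_per_day} deaths/day")
        print(f"Roughly {improvement:.0f}x fewer deaths, i.e. a ~{(1 - 1/improvement) * 100:.0f}% reduction")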

    • (Score: 0, Offtopic) by Anonymous Coward on Friday July 10 2020, @10:10AM

      by Anonymous Coward on Friday July 10 2020, @10:10AM (#1019009)

      If it's like our government's response to COVID-19, they will ban all automobiles because some of them have autopilot.

    • (Score: 3, Insightful) by inertnet on Friday July 10 2020, @11:19AM (10 children)

      by inertnet (4071) on Friday July 10 2020, @11:19AM (#1019022) Journal

      I'm convinced that it will exist someday, but we're still in the stone age of artificial intelligence. I agree that it won't be possible with traditional coding containing millions of decision trees. But I'm sure that in time, true AI autopilots will become better than human drivers and death rates from accidents will drop significantly. AI will use similar decision making to humans, but without the stupid parts and without freezing in the face of danger. Also without DUI, bad eyesight, old age, raging hormones and other limiting factors. Current autopilots are still inexperienced babies with poor eyesight, though.

      • (Score: 3, Touché) by PiMuNu on Friday July 10 2020, @01:21PM (7 children)

        by PiMuNu (3823) on Friday July 10 2020, @01:21PM (#1019046)

        > I'm convinced that it will exist someday

        Why?

        • (Score: 5, Insightful) by DannyB on Friday July 10 2020, @02:27PM (4 children)

          by DannyB (5839) Subscriber Badge on Friday July 10 2020, @02:27PM (#1019069) Journal

          IMO: because of the sheer amount of wasted human productivity spent on driving automobiles that could be used for better things.

          The motive to develop full self driving is very high.

          --
          What doesn't kill me makes me weaker for next time.
          • (Score: 2) by PiMuNu on Friday July 10 2020, @02:46PM (1 child)

            by PiMuNu (3823) on Friday July 10 2020, @02:46PM (#1019079)

            As is the motive to develop teleportation!

            • (Score: 2) by DannyB on Friday July 10 2020, @03:59PM

              by DannyB (5839) Subscriber Badge on Friday July 10 2020, @03:59PM (#1019121) Journal

              True. But I suspect self driving cars will happen prior to teleportation.

              --
              What doesn't kill me makes me weaker for next time.
          • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @01:52PM

            by Anonymous Coward on Saturday July 11 2020, @01:52PM (#1019494)

            > that could be used for better things.

            The next invention will be TVs in cars. Your point was?

          • (Score: 3, Insightful) by Bot on Sunday July 12 2020, @05:49AM

            by Bot (3902) on Sunday July 12 2020, @05:49AM (#1019742) Journal

            I'll tell you a secret. We don't need more productivity. We need productivity to be aimed at well-being, not at furthering the control of a cabal over the rest of humanity.

            We have overproduction and trouble with jobs.
            Say you command an army HQ full of draftees. You need half, even a tenth, of them to run the place. The logical solution is to take turns. But if you were a central banker instead of an army general, you would do it like this: take a tenth of the force, make them work 24/7, and restrict food/goods for the rest. Create two parties: one that says the workers should have more power since everything depends on them, and one that says the workers should share their food/goods with everybody because they are privileged. So instead of concentrating on how to become soldiers, everybody is scrambling either not to lose a job or to obtain one.

            --
            Account abandoned.
        • (Score: 2) by inertnet on Friday July 10 2020, @03:23PM (1 child)

          by inertnet (4071) on Friday July 10 2020, @03:23PM (#1019102) Journal

          Because the only thing we don't have yet is good enough AI. Technology already beats us on eyesight (and not only in visible light), hearing, attention, fatigue, hormones, reaction time, and many of the other things that make humans average drivers at best.

          • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @01:54PM

            by Anonymous Coward on Saturday July 11 2020, @01:54PM (#1019496)

            So the only thing left is the hard part. Cool, sounds like you're almost there. We used to have a little in-joke about that.

            1. Invent a car
            2. Remove the driver
            3. ???
            4. Profit

      • (Score: 2, Disagree) by Freeman on Friday July 10 2020, @04:38PM (1 child)

        by Freeman (732) on Friday July 10 2020, @04:38PM (#1019144) Journal

        "True AI" is a myth. How do you create a machine that is capable of making it's own decisions? How do you prove it? Given that it were to seem like it was making it's own decisions. It would be functioning in the way it's designers, created it to, it couldn't function any other way.

        --
        Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
        • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @09:14AM

          by Anonymous Coward on Saturday July 11 2020, @09:14AM (#1019436)

          Sure we can. Bottom right panel of https://xkcd.com/413/ [xkcd.com]

    • (Score: 1, Insightful) by Anonymous Coward on Friday July 10 2020, @06:47PM (3 children)

      by Anonymous Coward on Friday July 10 2020, @06:47PM (#1019187)

      Given how few republicans seem to care about the number of people killed by Covid, I think you are exaggerating things.

      • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @01:58PM

        by Anonymous Coward on Saturday July 11 2020, @01:58PM (#1019497)

        Exactly.

        When the cost of a Tesla Self-Masturbating Penis Compensator drops below that of a normal car, or truck or whatever, expect the change to come whether it's good for you the menial wage slave, die die die, or not.

      • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @02:24PM (1 child)

        by Anonymous Coward on Saturday July 11 2020, @02:24PM (#1019511)

        It's dumb to turn COVID partisan when one side wants to open up and the other side wants to mass demonstrate. Neither side really cares about the consequences, and/or each feels that its justification is more important than the deaths it will inevitably cause.

        • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @03:20PM

          by Anonymous Coward on Saturday July 11 2020, @03:20PM (#1019539)

          Well hardly. One side wants you to get back in the firing line and the other side wants to dismantle the system of conscription.

  • (Score: 5, Interesting) by Unixnut on Friday July 10 2020, @09:17AM (20 children)

    by Unixnut (5779) on Friday July 10 2020, @09:17AM (#1019001)

    Up until now, Tesla has refused liability for the deaths/damage caused by autopilot driven Teslas on the argument that "autopilot is an aid, not a driver replacement, the driver is responsible. They must be attentive with hands on the wheel at all times".

    With this new system, where they make a point that driver input is not needed, will they accept liability in the case of the (inevitable) crash such an equipped car will have?

    If they do accept liability, it will open them up to claims for every single accident, from a fender bender to a pile-up with multiple deaths, everywhere the car is sold. If they don't accept liability, then the car's driver has to give up control to the car and become a passenger, yet remain liable for any accidents that happen.

    I don't know about others, but I don't want to be responsible for something that I do not control. So if I am responsible, I will have final overriding control over the vehicle, making the "self driving" thing a non starter.

    Not that I think it is likely to ever happen, as it seems unlikely automated driving will ever be as good as the average attentive human. This is probably Musk just pumping Tesla stock so they can sell into the rally.

    Best case scenario is that it is good for stop-and-go traffic and motorway cruising, which in itself is not bad (both of those being very dreary experiences most people like to avoid). To be honest, I think 90% of people would be happy with a self-driving system that can do up to 30mph and follow traffic without hitting anything. That covers the biggest waste of time in the modern era, the rush hour commute (for those who cannot telecommute).

    • (Score: 1, Insightful) by Anonymous Coward on Friday July 10 2020, @09:39AM (8 children)

      by Anonymous Coward on Friday July 10 2020, @09:39AM (#1019003)

      Insurance will pay for accidents, just like it does for human drivers. Tesla will put an indemnification clause in the car purchasing agreement when you buy an autopilot equipped car, and insurance companies won't even mind because the self driving cars will be safer than human drivers, even if they aren't perfect.

      The only thing that will stop (or more realistically, slow) the adoption is if some artificial hurdle is created by governments.

      • (Score: 3, Interesting) by Non Sequor on Friday July 10 2020, @12:46PM (7 children)

        by Non Sequor (1005) on Friday July 10 2020, @12:46PM (#1019034) Journal

        Insurance will pay for accidents, just like it does for human drivers. Tesla will put an indemnification clause in the car purchasing agreement when you buy an autopilot equipped car, and insurance companies won't even mind because the self driving cars will be safer than human drivers, even if they aren't perfect.
        The only thing that will stop (or more realistically, slow) the adoption is if some artificial hurdle is created by governments.

        Except for some “no fault” states, the legal status quo is that the party at fault (and their insurer) pays. The insurer is allowed to offset its costs by subrogation, suing other parties that contributed to the accident.

        If a sensor degrades over its lifetime and Tesla does not enforce a maintenance schedule that keeps it up to spec, you can bet that they will have liability when this is hashed out. They have to visibly and credibly monitor and enforce a safety standard to avoid paying damages.

        --
        Write your congressman. Tell him he sucks.
        • (Score: 2) by RS3 on Friday July 10 2020, @02:39PM

          by RS3 (6367) on Friday July 10 2020, @02:39PM (#1019075)

          In answer to all of the above questions of liability: the decisions will come out of courts and juries.

        • (Score: 0) by Anonymous Coward on Friday July 10 2020, @03:30PM (5 children)

          by Anonymous Coward on Friday July 10 2020, @03:30PM (#1019104)

          Perhaps you don't appreciate what indemnification does. It gets you out of (almost) all responsibility. Tesla doesn't have to take responsibility for, or enforce anything. And nobody would want them to.

          Of course, if a sensor is so badly damaged that the software cannot operate safely, then the autopilot can and will shut down, and you will have to drive manually to the service station for repairs (or, a little farther into the future when the cars don't have any manual controls at all, get it towed).

          • (Score: 2) by RS3 on Friday July 10 2020, @04:19PM (4 children)

            by RS3 (6367) on Friday July 10 2020, @04:19PM (#1019133)

            Yeah, but this all raises the question: who will insure self-driving cars?

            Eventually everyone will if they prove safer than humans. But until we have years of solid accident statistics, I'm not sure how the insurance will work. Maybe Tesla will have to insure its cars and drivers?

            • (Score: 1, Insightful) by Anonymous Coward on Friday July 10 2020, @06:34PM (3 children)

              by Anonymous Coward on Friday July 10 2020, @06:34PM (#1019182)

              Maybe we could finally stop the horrible exploitative business of insurance?

              • (Score: 2) by RS3 on Saturday July 11 2020, @01:03AM (2 children)

                by RS3 (6367) on Saturday July 11 2020, @01:03AM (#1019298)

                I hear you. I think the principle of insurance is good, but like you said, it's been far too exploited and has incrementally been turned into an inherently corrupt system. Even the courts generally uphold it, and why shouldn't they: lots of their income comes from liability lawsuits, and whole jobs and industries have been built up around insurance. People "pad" claims, fraud investigations, courts, huge lawyers' fees, on and on. It appeals to socialists and capitalists alike. The more I think and write, the more disgusted I get; on to happier things, or at least ones I can deal with.

                • (Score: 2) by Bot on Sunday July 12 2020, @05:54AM (1 child)

                  by Bot (3902) on Sunday July 12 2020, @05:54AM (#1019743) Journal

                  I had a friend working in banks who said: yes this is evil but if you want the real evil look to insurance companies.
                  I believed him, especially since his bank started doing insurance too.

                  --
                  Account abandoned.
                  • (Score: 2) by RS3 on Sunday July 12 2020, @11:28AM

                    by RS3 (6367) on Sunday July 12 2020, @11:28AM (#1019798)

                    Someone once said something about the "love of money"...

    • (Score: 1, Disagree) by Anonymous Coward on Friday July 10 2020, @10:11AM

      by Anonymous Coward on Friday July 10 2020, @10:11AM (#1019010)

      I don't know about others, but I don't want to be responsible for something that I do not control. So if I am responsible, I will have final overriding control over the vehicle, making the "self driving" thing a non starter.

      Good thing you feel very safe and in control until a drunk driver t-bones you through a red light. But I'm sure until something like that happens, you will feel 100% safe because *you* maintain your illusion of control.

    • (Score: 3, Touché) by inertnet on Friday July 10 2020, @10:48AM (5 children)

      by inertnet (4071) on Friday July 10 2020, @10:48AM (#1019015) Journal

      Commercially it may have been a good idea, but legally it was a bad idea to call it 'auto pilot' while it is none; apparently, though, they're getting away with it.

      • (Score: 3, Interesting) by DannyB on Friday July 10 2020, @02:55PM

        by DannyB (5839) Subscriber Badge on Friday July 10 2020, @02:55PM (#1019088) Journal

        I don't know for a fact who to blame for choosing such a misleading name.

        In the absence of information, the most likely source of blame would be: Marketing (as always), backed up by Management.

        --
        What doesn't kill me makes me weaker for next time.
      • (Score: 0) by Anonymous Coward on Friday July 10 2020, @03:15PM

        by Anonymous Coward on Friday July 10 2020, @03:15PM (#1019101)

        "while it is none". Clearly you've never used an actual autopilot. Even the most advanced ones require pilot attention. They just keep the oily side facing down for the most part.

      • (Score: 3, Insightful) by cmdrklarg on Friday July 10 2020, @08:58PM (2 children)

        by cmdrklarg (5048) Subscriber Badge on Friday July 10 2020, @08:58PM (#1019227)

        The Autopilot system is aptly named, as it can do as much as or more than an aircraft's autopilot. The fact that people misunderstand what an autopilot can actually do shouldn't be Tesla's problem.

        Please note that I don't actually disagree with you, as I realize the ease in which people misunderstand things.

        --
        The world is full of kings and queens who blind your eyes and steal your dreams.
        • (Score: 2, Disagree) by etherscythe on Saturday July 11 2020, @01:59AM

          by etherscythe (937) on Saturday July 11 2020, @01:59AM (#1019329) Journal

          An autopilot is colloquially understood to be a system that can safely maintain a vehicle's movement path in the environment that the vehicle tends to normally be in. Airplanes do not have to deal with pedestrians or parked police cars, and do just fine. Land-based vehicles have a higher standard in practical terms for this reason, and the current system does not live up to these requirements. As much as I like Tesla, I will say that the name autopilot was an awful idea for anything that is not the final approved level 5 system. Unless you are arguing that only airplane pilots will buy Tesla vehicles...

          --
          "Fake News: anything reported outside of my own personally chosen echo chamber"
        • (Score: 3, Interesting) by toddestan on Saturday July 11 2020, @03:01AM

          by toddestan (4982) on Saturday July 11 2020, @03:01AM (#1019340)

          I'm certain that it was a very deliberate decision by Tesla to call their system "autopilot", knowing that people commonly misunderstand what an aircraft's autopilot does and hoping that people would then apply that misunderstanding to their cars. Meanwhile, they have a convenient out, because what their system is actually capable of is close to what an aircraft's autopilot actually does.

    • (Score: 2) by ledow on Friday July 10 2020, @03:46PM (1 child)

      by ledow (5567) on Friday July 10 2020, @03:46PM (#1019116) Homepage

      Whoever is in control has liability.

      If the car is driving and you don't interfere in its decisions - it's liable.
      If the car is driving and you override it - you're liable.
      If you're driving - you're liable.

      I can't be liable for something that I cannot affect, or couldn't reasonably be construed to be "in control of". An emergency stop button isn't control, because then I'm only liable if I press it, not if it needs to be pressed and I can't/didn't/won't.

      That's why all manufacturers are pushing "responsibility/control" to the driver at the moment.

      Change that and you change the liability: people sue you for running over their foot in the supermarket. Thousands. Millions of them. All suing one entity. It doesn't scale.

      • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @03:57PM

        by Anonymous Coward on Saturday July 11 2020, @03:57PM (#1019565)

        If you're a friend of the President, then you get a pardon.

        No need to bother with tricky laws and word games.

    • (Score: 2) by bussdriver on Friday July 10 2020, @04:22PM (1 child)

      by bussdriver (6876) Subscriber Badge on Friday July 10 2020, @04:22PM (#1019137)

      1) "autopilot" was severely flawed; it's still quite flawed but they keep marketing it as if it were more capable than it is; especially the misleading name of it. Obviously, Musk's definition is different than most of ours; why wouldn't he define "full autonomy" like he does "autopilot"?

      2) Liability costs: at some point they can afford SOME, but most likely they will find an insurance company they can convince to cover the risks... Remember, bad drivers with DWIs can still get insured, so the bar isn't impossibly high... you just might have to pay a "service fee" which includes their insurance costs. It's a business accounting game, just like H-1B visas, laying you off before retirement, etc.

      3) That last 20% takes 80% of the effort! It's a general engineering rule of thumb. The cars will arrive at about 80-90% done, depending upon insurance. That last 10-20% will take generations and quite possibly never reach 100%. The car can probably do better than the average driver at around 90% complete; it'll still slam on the brakes for every stop-sign T-shirt or plastic bag blown onto the road... Fresh snow on the road is hard to judge, because a lot of humans have trouble under those conditions too (the cars can't have GPS define the roadway; well, unless they pay for foreign GPS alternatives with more precision).

      • (Score: 0) by Anonymous Coward on Friday July 10 2020, @05:34PM

        by Anonymous Coward on Friday July 10 2020, @05:34PM (#1019165)

        Yes, 100% depends on the definition of 100%. There will never be a 100% safe vehicle. But if a 100% level-5 autonomous vehicle means one that can operate in sun and blizzard, safer than 95% of humans, then that's likely achievable with good enough sensors.

        I personally do not count driving with GPS and pre-scanned roads as level-5 autonomous operation, though.

  • (Score: 5, Insightful) by xorsyst on Friday July 10 2020, @10:25AM (15 children)

    by xorsyst (1372) on Friday July 10 2020, @10:25AM (#1019012)

    On the nice wide roads of the USA, and the major trunk routes, maybe. On the narrow streets of European towns, not a chance. Just look at the current autopilot videos on youtube and laugh at how hideously dangerous it is.

    • (Score: 3, Insightful) by TheReaperD on Friday July 10 2020, @11:11AM (7 children)

      by TheReaperD (5556) on Friday July 10 2020, @11:11AM (#1019018)

      Even worse, the streets of New Delhi, India. Good freakin' luck getting any sort of auto pilot to work there.

      --
      Ad eundum quo nemo ante iit
      • (Score: 2) by PiMuNu on Friday July 10 2020, @01:19PM (2 children)

        by PiMuNu (3823) on Friday July 10 2020, @01:19PM (#1019045)

        Yeah, but manual pilot doesn't work there either.

      • (Score: 3, Interesting) by RS3 on Friday July 10 2020, @02:43PM (2 children)

        by RS3 (6367) on Friday July 10 2020, @02:43PM (#1019077)

        It would work if most cars had autopilot, especially if they're all inter-networked and coordinate movement. The world already has huge swarms of "drone" copters flying in formation, each having to compensate for varying winds, including those from other drones.

        • (Score: 2) by TheReaperD on Friday July 10 2020, @11:45PM (1 child)

          by TheReaperD (5556) on Friday July 10 2020, @11:45PM (#1019274)

          To do this, you would have to replace every vehicle, not just cars, as they have a lot of bikes, rickshaws, and other vehicles that people in the US may consider odd. Who's going to pay for all of that? You can't expect someone with a salary equivalent to a few dollars a day (migrant workers get shafted) to buy a car that costs $20,000-$30,000 US to manufacture before profit. And unless they're all changed out, you're still going to have problems, as New Delhi's infrastructure has not kept up with its population boom.

          --
          Ad eundum quo nemo ante iit
          • (Score: 2) by RS3 on Saturday July 11 2020, @01:20AM

            by RS3 (6367) on Saturday July 11 2020, @01:20AM (#1019311)

            No need to challenge me; I agree with you. Not including all of that does not mean I didn't think it; I just didn't write a novel explaining everything, including a timeline. I was just briefly posing what a future scenario could possibly look like. It would take a long time to adopt/integrate on a wide scale, and nobody said it had to be done, or done everywhere.

            Frankly, if you think about it, if enough cars were fully on autopilot, there'd be far fewer abrupt motion changes, and bikes, rickshaws, etc. would have an easier time.

      • (Score: 3, Funny) by DannyB on Friday July 10 2020, @02:57PM

        by DannyB (5839) Subscriber Badge on Friday July 10 2020, @02:57PM (#1019090) Journal

        Good freakin' luck getting any sort of auto pilot to work there.

        It depends upon the definition of "work".

        It may "work" just as amazingly well as outsourced software development "works".

        --
        What doesn't kill me makes me weaker for next time.
    • (Score: 3, Interesting) by RS3 on Friday July 10 2020, @02:46PM (6 children)

      by RS3 (6367) on Friday July 10 2020, @02:46PM (#1019080)

      Re: small winding towns: do those "autopilots" have pre-loaded maps they follow? It seems that if they did, it would be pretty easy to base the driving on known maps and make adjustments from there.

      Over the years there have been many proposals for some kind of guidance system embedded in roadways. RFID tags could work; they're cheap and easily installed.

      • (Score: 3, Insightful) by PiMuNu on Friday July 10 2020, @02:52PM (5 children)

        by PiMuNu (3823) on Friday July 10 2020, @02:52PM (#1019084)

        "Automated driver only" lanes on highways would be easy to implement and could stage in the automation, once there is a reasonable base level of adoption (say 10 % of cars have automated driver).

        • (Score: 0) by Anonymous Coward on Friday July 10 2020, @03:40PM (4 children)

          by Anonymous Coward on Friday July 10 2020, @03:40PM (#1019109)

          The question is what broader goal this serves. We have HOV lanes because governments want to encourage higher vehicle occupancy for more efficient road use. They don't really work, and they aren't as good as more public transit, but that's the goal.

          Self-driving only lanes, if they're separated physically from the rest of traffic, could potentially allow higher speed limits, but only if the roads are maintained adequately, and only if the increased energy cost is not a problem. And many existing HOV lanes don't have this.

          More likely this will only happen once self driving cars are considered to be much safer than human drivers, in which case there will already be so many of them that it will be more like closing certain roads to humans, sort of like how cities sometimes close roads to cars so that pedestrians or buses can have exclusive use of them.

          • (Score: 0) by Anonymous Coward on Friday July 10 2020, @04:56PM (3 children)

            by Anonymous Coward on Friday July 10 2020, @04:56PM (#1019151)

            They don't really work, and they aren't as good as more public transit, but that's the goal.

            I think HOV lanes can work in principle but that in practice they don't because they aren't built in a way that actually encourages carpooling.

            In my city, they added HOV lanes to the main throughway, presumably so someone can point and say "look at all the things we're doing to solve transportation problems", without any thought about how the road is actually used by commuters.

            Specifically, they added them only to the widest sections of the freeway, where there are more than 3 lanes. So on the morning commute to the downtown core, the HOV lane ends just before you get to the 3-lane portion of the highway, which incidentally is the only part of the highway that gets congested during the morning rush hour. On the reverse trip, the HOV lane starts after the road is no longer congested.

            Needless to say there is rarely a reason to use the HOV lane even if you are carpooling.

            If they had the guts to literally run the HOV lane all the way across the city, leaving just two lanes for regular traffic in the downtown core, I bet we'd see a lot more carpooling. It also might help to give the HOV lane a higher speed limit (it is somewhat separated from adjacent traffic after all).

            Of course barely anyone is commuting right now so there probably aren't major congestion problems that need solving at the moment.

            • (Score: 2) by RS3 on Saturday July 11 2020, @01:23AM (2 children)

              by RS3 (6367) on Saturday July 11 2020, @01:23AM (#1019315)

              Then you get the people who put mannequins in their car and use the HOV lanes.
               

              • (Score: 2) by toddestan on Saturday July 11 2020, @03:06AM (1 child)

                by toddestan (4982) on Saturday July 11 2020, @03:06AM (#1019344)

                Not to mention all the cities that figured out that by letting people drive solo in the HOV lanes for a fee, they suddenly have a nice new revenue source.

                Around here, the price scales depending on congestion, but is almost always cheaper than the express bus fare.

                • (Score: 2) by RS3 on Saturday July 11 2020, @03:32AM

                  by RS3 (6367) on Saturday July 11 2020, @03:32AM (#1019363)

                  Ugh. I don't know if I'd ever heard of that.

                  It's always about money. It reminds me of the traffic light cameras that send automatic tickets. If they really wanted to make intersections safer, they'd do what Russia and some other countries do: put up a countdown timer so you'd know when the light is about to turn yellow. And they'd make the yellow long enough to give people a chance to stop (and not in an all-out, panic-mash-the-brake-pedal way).

  • (Score: 0) by Anonymous Coward on Friday July 10 2020, @12:10PM

    by Anonymous Coward on Friday July 10 2020, @12:10PM (#1019028)

    Full self-driving Tesla tech is "very close" to failing miserably.

    I'll be here all day!

  • (Score: 3, Interesting) by looorg on Friday July 10 2020, @12:49PM

    by looorg (578) on Friday July 10 2020, @12:49PM (#1019037)

    The question, or problem, then is that a completely self-driving car probably isn't legal on the road. The law will probably be playing catch-up on this for a long time. I don't see this being a thing for quite some time.

    That said, some people I know who got Teslas say the cars more or less already drive them (to and from work), and they are spending more and more time just listening to audiobooks etc. when they "drive".

    I wonder what this will mean for the "driver's license"; after all, if you don't actually have to drive the car anymore, then you probably don't need to know the rules and laws, or to practice, either. So what will the license become? An owner's permit?

  • (Score: 2) by Immerman on Friday July 10 2020, @01:57PM (9 children)

    by Immerman (3985) on Friday July 10 2020, @01:57PM (#1019061)

    Almost there - all that's left is the last 10% that takes 90% of the work.

    • (Score: 3, Insightful) by DannyB on Friday July 10 2020, @03:01PM (8 children)

      by DannyB (5839) Subscriber Badge on Friday July 10 2020, @03:01PM (#1019093) Journal

      Incorrect. It actually works like this. As one would expect.

      The first 90% of the result takes 90% of the work, time and cost.

      The remaining 10% of the result takes the other 90% of the work, time and cost.

      --
      What doesn't kill me makes me weaker for next time.
      • (Score: 2) by Immerman on Friday July 10 2020, @04:08PM (4 children)

        by Immerman (3985) on Friday July 10 2020, @04:08PM (#1019128)

        I have never heard it expressed that way. In fact, the Pareto principle (usually referring to the 80:20 rule rather than the 90:10 rule often used in software development) is often applied to distributions as well - e.g. 20% of the population owns 80% of the wealth. Not the smartass *other* 80%. 80% of the total.

        You get your project 90% done - it's basically working; all that's left is the final few features and smoothing out the rough edges, handling corner cases, etc. Which, when completed, will have taken 90% of the total amount of work spent on the project.

        • (Score: 2) by Freeman on Friday July 10 2020, @04:42PM (3 children)

          by Freeman (732) on Friday July 10 2020, @04:42PM (#1019145) Journal

          https://www.forbes.com/sites/chunkamui/2018/02/28/driverless-cars-90-percent-done-90-percent-left-to-go/#54288ca520a0 [forbes.com]

          I've heard it before, not exactly sure how far back it goes, but it's a reasonably popular expression.

          --
          Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
          • (Score: 2) by Immerman on Friday July 10 2020, @08:29PM (2 children)

            by Immerman (3985) on Friday July 10 2020, @08:29PM (#1019213)

            Okay, so the 90-90 saying does exist. Still makes no logical sense (even if the meaning is clear), and I strongly suspect it's a bastardization of the 90/10 rule*, which itself is a more extreme version of the 80/20 rule, aka the Pareto Principle, which was coined in 1876.

            *a dramatically weaker bastardization at that, since it claims the 90% completion mark is the halfway point in terms of labor (the "other 90%", aka 50%, still to go), rather than the 10% mark

            • (Score: 2) by deimtee on Saturday July 11 2020, @09:32AM (1 child)

              by deimtee (3272) on Saturday July 11 2020, @09:32AM (#1019441) Journal

              I always thought it was having a go at over-optimistic estimators. 80% over original time and budget seems fairly usual.

              --
              If you cough while drinking cheap red wine it really cleans out your sinuses.
              • (Score: 3, Interesting) by Immerman on Saturday July 11 2020, @01:20PM

                by Immerman (3985) on Saturday July 11 2020, @01:20PM (#1019478)

                Not specifically, it doesn't even have anything to do with whether your estimate was accurate. It's more of an observational commentary about a frequently occurring exponential-like* distribution pattern.

                It's not totally unlike commenting on a bell curve: a whole lot of random distributions tend to follow a Gaussian distribution, which you might say follows the "68/1" rule - 68% of the results fall within 1 standard deviation of the mean.

                If instead it follows an exponential-like distribution, something like the 80/20 or 90/10 rule applies (depending on just how dramatically the exponential curve climbs). It seems to crop up a lot when you have self-amplifying effects at work - e.g. the richer you are, the easier it is to acquire an additional $1000, and so wealth distribution follows such a curve.

                For programming - the larger the code base, the more work is generally required to fix a bug or add a feature, so that adding the last 10% of features may take much longer than adding the first 90%. Ideally you'd keep that in mind when making an estimate, recognize that our brains are really terrible at forecasting exponential growth, and multiply your initial estimate by 5x to 10x to compensate for that grueling last mile of progress. (Or as sometimes expressed - take your first estimate, multiply by two, then switch to the next larger unit of time. Three weeks => six months.)

                *I say exponential-like because an "exponential distribution" refers to a very specific distribution curve, and I'm not sure it's actually the relevant one.
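
                For the curious, here is a minimal Python sketch of the two rules of thumb above (the function names and numbers are purely illustrative, not anyone's formal model):

                    # The 90-90 rule: the first 90% of the result takes 90% of the work,
                    # and the remaining 10% of the result takes the *other* 90%.
                    def ninety_ninety_total(effort_at_90_percent_done: float) -> float:
                        """Total effort implied once you've 'finished 90%' of the project."""
                        return effort_at_90_percent_done * 2  # the other 90% still remains

                    # The estimating heuristic mentioned above: multiply by two,
                    # then switch to the next larger unit of time.
                    UNIT_BUMP = {"days": "weeks", "weeks": "months", "months": "years"}

                    def pessimistic_estimate(amount: float, unit: str) -> str:
                        return f"{amount * 2:g} {UNIT_BUMP.get(unit, unit)}"

                    print(ninety_ninety_total(9))            # 18 units of effort in total
                    print(pessimistic_estimate(3, "weeks"))  # three weeks => "6 months"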

      • (Score: 2) by RS3 on Friday July 10 2020, @04:22PM (1 child)

        by RS3 (6367) on Friday July 10 2020, @04:22PM (#1019136)

        I don't know if you've ever listened to "Car Talk" but Ray used to refer to the "third half of our show".

        • (Score: 2) by DannyB on Friday July 10 2020, @04:26PM

          by DannyB (5839) Subscriber Badge on Friday July 10 2020, @04:26PM (#1019140) Journal

          I didn't listen, but my wife did, long ago. So I heard some of it.

          --
          What doesn't kill me makes me weaker for next time.
      • (Score: 0) by Anonymous Coward on Sunday July 12 2020, @06:46PM

        by Anonymous Coward on Sunday July 12 2020, @06:46PM (#1019960)

        That's why the developers need to give 110% otherwise it won't get finished [del]on time[/del] late.
