
SoylentNews is people

posted by martyb on Friday July 10 2020, @07:39AM
from the blizzards-and-downpours-and-lightning-oh-my! dept.

Tesla 'very close' to full self-driving, Musk says:

Tesla will be able to make its vehicles completely autonomous by the end of this year, chief executive Elon Musk has said.

It was already "very close" to achieving the basic requirements of this "level-five" autonomy, which requires no driver input, he said.

Tesla's current, level-two Autopilot requires the driver to remain alert and ready to act, with hands on the wheel.

But a future software update could activate level-five autonomy in the cars - with no new hardware, he said.

Regulatory hurdles could block implementation even if the remaining technical hurdles are overcome.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Interesting) by Unixnut on Friday July 10 2020, @09:17AM (20 children)

    by Unixnut (5779) on Friday July 10 2020, @09:17AM (#1019001)

    Up until now, Tesla has refused liability for the deaths/damage caused by autopilot driven Teslas on the argument that "autopilot is an aid, not a driver replacement, the driver is responsible. They must be attentive with hands on the wheel at all times".

    With this new system, where they make a point that driver input is not needed, will they accept liability in the case of the (inevitable) crash such an equipped car will have?

If they do accept liability, then it will open them up for every single accident, from a fender bender to a pile-up with multiple deaths, everywhere the car is sold. If they don't accept liability, then the car's driver has to give up control to the car and become a passenger, but be liable for any accidents that happen.

    I don't know about others, but I don't want to be responsible for something that I do not control. So if I am responsible, I will have final overriding control over the vehicle, making the "self driving" thing a non starter.

    Not that I think it is likely to ever happen, as it seems unlikely automated driving will ever be as good as the average attentive human. This is probably Musk just pumping Tesla stock so they can sell into the rally.

Best case scenario is that it is good for stop-and-go traffic and motorway cruising, which in itself is not bad (both of those being very dreary experiences most people like to avoid). To be honest, I think 90% of people would be happy with a self-driving system that can do up to 30mph and follow traffic without hitting anything. That covers the biggest waste of time in the modern era, the rush hour commute (for those who cannot telecommute).

  • (Score: 1, Insightful) by Anonymous Coward on Friday July 10 2020, @09:39AM (8 children)

    by Anonymous Coward on Friday July 10 2020, @09:39AM (#1019003)

    Insurance will pay for accidents, just like it does for human drivers. Tesla will put an indemnification clause in the car purchasing agreement when you buy an autopilot equipped car, and insurance companies won't even mind because the self driving cars will be safer than human drivers, even if they aren't perfect.

    The only thing that will stop (or more realistically, slow) the adoption is if some artificial hurdle is created by governments.

    • (Score: 3, Interesting) by Non Sequor on Friday July 10 2020, @12:46PM (7 children)

      by Non Sequor (1005) on Friday July 10 2020, @12:46PM (#1019034) Journal

      Insurance will pay for accidents, just like it does for human drivers. Tesla will put an indemnification clause in the car purchasing agreement when you buy an autopilot equipped car, and insurance companies won't even mind because the self driving cars will be safer than human drivers, even if they aren't perfect.
      The only thing that will stop (or more realistically, slow) the adoption is if some artificial hurdle is created by governments.

      Except for some “no fault” states, the legal status quo is that the party at fault (and their insurer) pays. The insurer is allowed to offset its costs by subrogation, suing other parties that contributed to the accident.

      If a sensor degrades over its lifetime and Tesla does not enforce a maintenance schedule that keeps it up to spec, you can bet that they will have liability when this is hashed out. They have to visibly and credibly monitor and enforce a safety standard to avoid paying damages.

      --
      Write your congressman. Tell him he sucks.
      • (Score: 2) by RS3 on Friday July 10 2020, @02:39PM

        by RS3 (6367) on Friday July 10 2020, @02:39PM (#1019075)

        In answer to all of the above questions of liability: the decisions will come out of courts and juries.

      • (Score: 0) by Anonymous Coward on Friday July 10 2020, @03:30PM (5 children)

        by Anonymous Coward on Friday July 10 2020, @03:30PM (#1019104)

        Perhaps you don't appreciate what indemnification does. It gets you out of (almost) all responsibility. Tesla doesn't have to take responsibility for, or enforce anything. And nobody would want them to.

        Of course, if a sensor is so badly damaged that the software cannot operate safely, then the autopilot can and will shut down, and you will have to drive manually to the service station for repairs (or, a little farther into the future when the cars don't have any manual controls at all, get it towed).

        • (Score: 2) by RS3 on Friday July 10 2020, @04:19PM (4 children)

          by RS3 (6367) on Friday July 10 2020, @04:19PM (#1019133)

Yeah, but this all raises the question: who will insure self-driving cars?

          Eventually everyone will if they prove safer than humans. But until we have years of solid accident statistics, I'm not sure how the insurance will work. Maybe Tesla will have to insure its cars and drivers?

          • (Score: 1, Insightful) by Anonymous Coward on Friday July 10 2020, @06:34PM (3 children)

            by Anonymous Coward on Friday July 10 2020, @06:34PM (#1019182)

            Maybe we could finally stop the horrible exploitative business of insurance?

            • (Score: 2) by RS3 on Saturday July 11 2020, @01:03AM (2 children)

              by RS3 (6367) on Saturday July 11 2020, @01:03AM (#1019298)

I hear you. I think the principle of insurance is good, but like you said it's been far too exploited and has incrementally been turned into an inherently corrupt system. Even the courts generally uphold it, and why shouldn't they? Lots of their income comes from liability lawsuits; whole jobs and industries have been built up around insurance. People "pad" claims, fraud investigations, courts, huge lawyers' fees, on and on. It appeals to socialists and capitalists alike. The more I think and write, the more disgusted I get. On to happier things, or at least ones I can deal with.

              • (Score: 2) by Bot on Sunday July 12 2020, @05:54AM (1 child)

                by Bot (3902) on Sunday July 12 2020, @05:54AM (#1019743) Journal

                I had a friend working in banks who said: yes this is evil but if you want the real evil look to insurance companies.
                I believed him, especially since his bank started doing insurance too.

                --
                Account abandoned.
                • (Score: 2) by RS3 on Sunday July 12 2020, @11:28AM

                  by RS3 (6367) on Sunday July 12 2020, @11:28AM (#1019798)

                  Someone once said something about the "love of money"...

  • (Score: 1, Disagree) by Anonymous Coward on Friday July 10 2020, @10:11AM

    by Anonymous Coward on Friday July 10 2020, @10:11AM (#1019010)

    I don't know about others, but I don't want to be responsible for something that I do not control. So if I am responsible, I will have final overriding control over the vehicle, making the "self driving" thing a non starter.

    Good thing you feel very safe and in control until a drunk driver t-bones you through a red light. But I'm sure until something like that happens, you will feel 100% safe because *you* maintain your illusion of control.

  • (Score: 3, Touché) by inertnet on Friday July 10 2020, @10:48AM (5 children)

    by inertnet (4071) on Friday July 10 2020, @10:48AM (#1019015) Journal

Commercially it may have been a good idea, but legally it was a bad idea to call it 'auto pilot' when it isn't one. Apparently they're getting away with it, though.

    • (Score: 3, Interesting) by DannyB on Friday July 10 2020, @02:55PM

      by DannyB (5839) Subscriber Badge on Friday July 10 2020, @02:55PM (#1019088) Journal

      I don't know for a fact who to blame for choosing such a misleading name.

      In the absence of information, the most likely source of blame would be: Marketing (as always), backed up by Management.

      --
      The lower I set my standards the more accomplishments I have.
    • (Score: 0) by Anonymous Coward on Friday July 10 2020, @03:15PM

      by Anonymous Coward on Friday July 10 2020, @03:15PM (#1019101)

      "while it is none". Clearly you've never used an actual autopilot. Even the most advanced ones require pilot attention. They just keep the oily side facing down for the most part.

    • (Score: 3, Insightful) by cmdrklarg on Friday July 10 2020, @08:58PM (2 children)

      by cmdrklarg (5048) Subscriber Badge on Friday July 10 2020, @08:58PM (#1019227)

The Autopilot system is aptly named, as it can do as much as or more than an aircraft's autopilot. The fact that people misunderstand what an autopilot can actually do shouldn't be Tesla's problem.

      Please note that I don't actually disagree with you, as I realize the ease with which people misunderstand things.

      --
      The world is full of kings and queens who blind your eyes and steal your dreams.
      • (Score: 2, Disagree) by etherscythe on Saturday July 11 2020, @01:59AM

        by etherscythe (937) on Saturday July 11 2020, @01:59AM (#1019329) Journal

        An autopilot is colloquially understood to be a system that can safely maintain a vehicle's movement path in the environment that the vehicle tends to normally be in. Airplanes do not have to deal with pedestrians or parked police cars, and do just fine. Land-based vehicles have a higher standard in practical terms for this reason, and the current system does not live up to these requirements. As much as I like Tesla, I will say that the name autopilot was an awful idea for anything that is not the final approved level 5 system. Unless you are arguing that only airplane pilots will buy Tesla vehicles...

        --
        "Fake News: anything reported outside of my own personally chosen echo chamber"
      • (Score: 3, Interesting) by toddestan on Saturday July 11 2020, @03:01AM

        by toddestan (4982) on Saturday July 11 2020, @03:01AM (#1019340)

I'm certain that it was a very deliberate decision by Tesla to call their system "autopilot", knowing that people commonly misunderstand what an aircraft's autopilot does and hoping that people would then apply that misunderstanding to their cars. Meanwhile they have a convenient out, as what their system is actually capable of is close to what an aircraft's autopilot actually does.

  • (Score: 2) by ledow on Friday July 10 2020, @03:46PM (1 child)

    by ledow (5567) on Friday July 10 2020, @03:46PM (#1019116) Homepage

    Whoever is in control has liability.

    If the car is driving and you don't interfere in its decisions - it's liable.
    If the car is driving and you override it - you're liable.
    If you're driving - you're liable.

    I can't be liable for something that I cannot affect, or couldn't reasonably be construed to be "in control of". An emergency stop button isn't control, because then I'm only liable if I press it, not if it needs to be pressed and I can't/didn't/won't.

    That's why all manufacturers are pushing "responsibility/control" to the driver at the moment.

Change that and you change the liability: people sue you for running over their foot in the supermarket. Thousands. Millions of them. All suing one entity. It doesn't scale.

    • (Score: 0) by Anonymous Coward on Saturday July 11 2020, @03:57PM

      by Anonymous Coward on Saturday July 11 2020, @03:57PM (#1019565)

      If you're a friend of the President, then you get a pardon.

      No need to bother with tricky laws and word games.

  • (Score: 2) by bussdriver on Friday July 10 2020, @04:22PM (1 child)

    by bussdriver (6876) Subscriber Badge on Friday July 10 2020, @04:22PM (#1019137)

1) "autopilot" was severely flawed; it's still quite flawed, but they keep marketing it as if it were more capable than it is, especially with its misleading name. Obviously, Musk's definition is different from most of ours; why wouldn't he define "full autonomy" the way he defines "autopilot"?

    2) Liability costs: at some point they can afford SOME but most likely they find an insurance company they can convince to cover the risks... Remember, bad drivers with DWI can still get insured, so the bar isn't impossible... you just might have to pay a "service fee" which includes their insurance costs. It's a business accounting game just like H1 visas, laying you off before retirement, etc.

3) That last 20% takes 80% of the effort! General engineering metric. The cars will happen at about 80-90% done, depending upon insurance. That last 10-20% will take generations and quite possibly never reach 100%. The car can probably do better than the average driver somewhere around 90% complete; it'll slam on the brakes for every T-shirt stop sign or plastic bag blown onto the road. Fresh snow on the road is hard to judge, because a lot of humans have trouble under those conditions too (they can't have GPS define the roadway; well, unless they pay for foreign GPS alternatives with more precision).

    • (Score: 0) by Anonymous Coward on Friday July 10 2020, @05:34PM

      by Anonymous Coward on Friday July 10 2020, @05:34PM (#1019165)

Yes, 100% depends on the definition of 100%. There will never be a 100% safe vehicle. But if a 100% level-5 autonomous vehicle means one that can operate in sun and blizzard, safer than 95% of humans, then that's likely achievable with good enough sensors.

      I personally do not count driving with GPS and prescanned roads as level-5 autonomous operation, though.