posted by Fnord666 on Sunday November 04 2018, @07:04AM   Printer-friendly
from the shouldn't-it-be-auto-driver? dept.

Submitted via IRC for Bytram

Another Tesla with Autopilot crashed into a stationary object—the driver is suing

Earlier this month, Shawn Hudson's Tesla Model S crashed into a stalled car while moving at about 80 miles per hour on a Florida freeway. Tesla's Autopilot technology was engaged at the time, and Hudson has now filed a lawsuit against Tesla in state courts.

"Through a pervasive national marketing campaign and a purposefully manipulative sales pitch, Tesla has duped consumers" into believing that Autpilot can "transport passengers at highway speeds with minimal input and oversight," the lawsuit says.

Hudson had a two-hour commute to his job at an auto dealership. He says that he heard about Tesla's Autopilot technology last year and went to a Tesla dealership to learn more.

"Tesla's sales representative reassured Hudson that all he needed to do as the driver of the vehicle is to occasionally place his hand on the steering wheel and that the vehicle would 'do everything else,'" the lawsuit claims.


But that description of Tesla's Autopilot system is not true. While the system can handle a range of driving conditions, it's not designed to stop for parked cars or other stationary objects when traveling at highway speeds. This year, at least two other Tesla drivers have plowed into parked vehicles while their cars were in Autopilot mode (one of them sued Tesla last month). Another Tesla customer, Californian Walter Huang, was killed when his Tesla vehicle ran into a concrete lane divider at full speed.

"It is the driver's responsibility to remain attentive to their surroundings and in control of the vehicle at all times," a Tesla spokesman told Ars by email. "Tesla goes to great lengths to provide clear instructions about what Autopilot is and is not, including by offering driver instructions when owners test drive and take delivery of their car, before drivers enable Autopilot and every single time they use Autopilot, as well as through the Owner's Manual and Release Notes for software updates." (I've reproduced Tesla's full emailed statement at the end of the story.)


Original Submission

  • (Score: 3, Insightful) by ataradov on Sunday November 04 2018, @07:26AM (17 children)

    by ataradov (4776) on Sunday November 04 2018, @07:26AM (#757522) Homepage

    It does not matter what the salesman says. What matters is what the manual says. And I'm sure it was written by a cohort of lawyers and is spotless.

    • (Score: 5, Informative) by Blymie on Sunday November 04 2018, @08:16AM (9 children)

      by Blymie (4020) on Sunday November 04 2018, @08:16AM (#757531)

      Uh, no. It *does* matter what the salesman says, but in most cases you'd be suing a dealership for lying.

      In this case there are no dealerships, as Tesla is the only one that sells their cars. And any representative of a corp binds that corp to verbal contracts, potential issues with fraud, and other blather when discussing things with you.

      That is if that person meets a variety of tests, such as "this person is doing a job assigned by corp $x, and it is reasonable to see them as a representative". Salesman is definitely there.

      You aren't bound by a manual, FYI, unless you sign some agreement. It's not a contract.

      • (Score: 1, Insightful) by Anonymous Coward on Sunday November 04 2018, @12:20PM (5 children)

        by Anonymous Coward on Sunday November 04 2018, @12:20PM (#757580)

        Some recording, or whatever? Every luser who failed to RTFM goes on to blame everything and everyone but himself; word of mouth from such a source means nothing.

        • (Score: 0) by Anonymous Coward on Sunday November 04 2018, @02:51PM (3 children)

          by Anonymous Coward on Sunday November 04 2018, @02:51PM (#757621)

          It doesn't matter what the manual says; it's called "autopilot", which in the minds of the general public means exactly what the plaintiff alleges he was told by the salesperson.

          Here on the site we know better, but let's be honest about the fact that the prefix "auto" has a meaning and "pilot" does as well. The term really means self drive, but in other contexts there's a slightly different set of expectations placed on those systems. Those systems also are not used by people without a ton of specialized training to make sure they can operate the vehicle if the autopilot can't and to know what the autopilot can and can't do.

          • (Score: 1) by Sulla on Sunday November 04 2018, @03:06PM (2 children)

            by Sulla (5173) on Sunday November 04 2018, @03:06PM (#757627) Journal

            minds of the general public

            These are people who can afford to buy Teslas

            --
            Ceterum censeo Sinae esse delendam
            • (Score: 5, Insightful) by number11 on Sunday November 04 2018, @03:56PM (1 child)

              by number11 (1170) Subscriber Badge on Sunday November 04 2018, @03:56PM (#757640)

              These are people who can afford to buy Teslas

              Minimal observation shows that having money is not an indicator of intelligence or competence.

              • (Score: 1, Insightful) by Anonymous Coward on Monday November 05 2018, @12:46AM

                by Anonymous Coward on Monday November 05 2018, @12:46AM (#757798)

                In fact, money seems to be inversely proportional to intelligence and competence.

        • (Score: 2) by urza9814 on Monday November 05 2018, @07:41PM

          by urza9814 (3954) on Monday November 05 2018, @07:41PM (#758158) Journal

          Fuck that, let's set some precedents that allow suing companies into oblivion for the blatant scams that their sales teams run. Maybe then companies will start to think twice about promising *fucking everything, even when none of it exists yet* just to win a contract...

      • (Score: 2) by Spamalope on Sunday November 04 2018, @01:55PM (2 children)

        by Spamalope (5233) on Sunday November 04 2018, @01:55PM (#757604) Homepage

        Dealers won't sell the car without a sales contract saying basically 'we can do the opposite of everything the salesman promised and you agree that it's your problem as a condition of the sale'. It's always in there.

        Also - if the salesman promises anything about the car - make them put it in the vehicle description on the sales contract - or it's a lie. (ex: Car smashed in accident while being transported from the factory, repaired badly and sold as new - blows broken glass fragments into eye of buyer - contract doesn't say unrepaired... sigh)

        • (Score: 0) by Anonymous Coward on Sunday November 04 2018, @02:54PM

          by Anonymous Coward on Sunday November 04 2018, @02:54PM (#757622)

          That's why we need to stop electing and appointing incompetent judges. How the dealership interprets the contract isn't the legal standard, it's how the 2nd party interprets it. If there's more than one interpretation, it's supposed to be the 2nd party's interpretation that is used, not the party that wrote the freaking contract. And it's like that specifically because of things like this.

          You shouldn't have to be a lawyer to avoid being ripped off by the fraudulent behavior of dealers.

        • (Score: 2) by Whoever on Sunday November 04 2018, @06:02PM

          by Whoever (4524) on Sunday November 04 2018, @06:02PM (#757683) Journal

          Dealers won't sell the car without a sales contract saying basically 'we can do the opposite of everything the salesman promised and you agree that it's your problem as a condition of the sale'. It's always in there.

          It's called an "integrated contract".

    • (Score: 2) by VLM on Sunday November 04 2018, @02:09PM (1 child)

      by VLM (445) on Sunday November 04 2018, @02:09PM (#757608)

      The good news is verbal contracts are a thing.

      The bad news (for Tesla) is they have umpty bazillion sales droids, and it should be easy to record one in a one-party recording state at their job making ridiculous promises to make a sale.

      • (Score: 2) by Whoever on Sunday November 04 2018, @03:31PM

        by Whoever (4524) on Sunday November 04 2018, @03:31PM (#757635) Journal

        The good news is verbal contracts are a thing.

        So are integrated contracts.

    • (Score: 5, Insightful) by theluggage on Sunday November 04 2018, @10:28PM (4 children)

      by theluggage (1797) on Sunday November 04 2018, @10:28PM (#757753)

      Seems to me that Tesla are getting away with it because they can always blame the driver for ignoring warnings. What they're doing is still stupid and dangerous - if you sell something as "autopilot" of course people are going to treat it like a Hollywood airplane get-up-and-have-a-fist-fight autopilot.

      Maybe a third party who suffered damages from an out-of-control Tesla would have better luck suing them - they wouldn't be negligent for ignoring Tesla's small print.

      If I were inclined to conspiracy theories I'd be tempted to think that the whole "self-driving car" thing was dreamed up by Big Oil and the Lizard People as a tar baby for the whole electric car industry. The technology may be 95% there, but we all know that the last 5% takes 95% of the time - and a 95% self-driving car that relies on the driver staying alert to cope with the 5% is a recipe for carnage.

      • (Score: 2) by boltronics on Monday November 05 2018, @02:32AM (1 child)

        by boltronics (580) on Monday November 05 2018, @02:32AM (#757821) Homepage Journal

        Agreed, Tesla shouldn't be selling it the way they allegedly did.

        I would also argue that taking onto the road a vehicle with a strong possibility of killing people in a collision, without so much as bothering to read the manual, is at least as bad.

        --
        It's GNU/Linux dammit!
        • (Score: 2) by theluggage on Monday November 05 2018, @08:14PM

          by theluggage (1797) on Monday November 05 2018, @08:14PM (#758172)

          I would also argue that going onto the road in a vehicle that has the strong possibility of killing people in the event of a collision without so much as bothering to read the manual, is at least as bad.

          True. I don't know why some people seem to think that, in order to place blame on one party, you have to take it away from someone else. There's always more than enough for everybody...

          Of course, since every instruction manual is now 90% condescendingly obvious safety warnings that treat the most modest of domestic implements as a potential Weapon of Mass Destruction* I have a small quantum of sympathy for someone who might have missed the one saying "Warning - don't expect the autopilot to automatically pilot your car".

          (* Warning - do not operate the Megadeath Omega 666 Doomsday Bomb with wet hands. Always seek the land owner's permission before reducing their continent to a radioactive plain of fused glass.)

      • (Score: 2) by darkfeline on Tuesday November 06 2018, @04:10AM (1 child)

        by darkfeline (1030) on Tuesday November 06 2018, @04:10AM (#758368) Homepage

        How many of these people would ride a plane where the pilots were busy not paying attention with the autopilot engaged? Yeah, no, that excuse doesn't fly.

        --
        Join the SDF Public Access UNIX System today!
        • (Score: 2) by theluggage on Wednesday November 07 2018, @12:26AM

          by theluggage (1797) on Wednesday November 07 2018, @12:26AM (#758762)

          How many of these people would ride a plane where the pilots were busy not paying attention with the autopilot engaged?

          Er... most of them? After all, that's how autopilots work in the movies, although I do think that one where the inflatable pilot appears and starts smiling after he gets, er, personal service from the stewardess might have been slightly inaccurate.

          Like 95% of people, I'm not a pilot - I don't personally know, or have any reason to know, exactly what the constraints are on the use of autopilots in aviation, although I'm pretty sure you don't need to keep your hands on the controls. Airplane! jokes aside, I've seen plenty of "serious" movies and TV in which autopilots are shown flying planes unattended, and heard references to modern airliners being able to land and take off by themselves. I've seen at least one documentary where a pilot picks up a clipboard and fills out a checklist while the autopilot flies the plane. Now, it just so happens that I'm a skeptical bastard who starts with the assumption that if it's on a TV or movie screen then it's a lie, even if it says "documentary" in the title. Others are less skeptical (some start letter-writing campaigns to get fictitious characters in or out of jail).

          I guarantee that a significant proportion of the population are absolutely certain that it's perfectly fine to leave a 747 on autopilot while you make a coffee.

          50% of people are below median* intelligence.

          Also, it doesn't even matter if you know that you're meant to pay attention and be ready to intervene - without the physical action of driving to keep you on-task, many people will just zone out. For the moment, it's much better to have the computer acting as a second pair of eyes for the driver than vice versa.

          Oh, and I wouldn't get on a plane if it was continually passing within a few feet of other planes flying in the opposite direction while being tailgated by a large white German plane driven by a caffeine-addled stockbroker who was going to rear-end you when you suddenly braked to avoid a helicopter that was inexplicably stopped in the middle of the flight path. I'm not sure the concept of an airplane autopilot even applies to driving.

          (* including everybody who just assumes that every distribution is symmetrical)

  • (Score: 5, Informative) by bradley13 on Sunday November 04 2018, @07:29AM (23 children)

    by bradley13 (3053) on Sunday November 04 2018, @07:29AM (#757523) Homepage Journal

    For anyone who hasn't looked into this: Apparently all car navigation aids consider stationary objects to be part of the scenery, at least when on a highway. Lots of stationary objects are whizzing by, too many to process, so they are ignored. Stationary objects in the middle of a highway do not exist.

    So when they, in fact, do exist, well, this happens...

    Odd, actually. This seems like a simple thing to fix: ignore stationary objects unless they are directly ahead, when they suddenly become rather important.
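
    As a rough illustration of the filter being proposed here (a minimal sketch in Python, with made-up field names and thresholds, not anything from an actual vehicle):

        # Hypothetical sketch of "ignore stationary objects unless directly ahead".
        # Each radar return is assumed to carry a lateral offset from the car's
        # predicted path and a speed over ground; names and thresholds are invented.

        LANE_HALF_WIDTH_M = 1.8   # within ~a lane half-width counts as "directly ahead"
        STATIONARY_MPS = 0.5      # below this speed, treat the object as stationary

        def returns_worth_braking_for(radar_returns):
            """Keep moving objects, and stationary objects only if they are in-path."""
            relevant = []
            for obj in radar_returns:
                stationary = abs(obj["ground_speed_mps"]) < STATIONARY_MPS
                in_path = abs(obj["lateral_offset_m"]) < LANE_HALF_WIDTH_M
                if not stationary or in_path:
                    relevant.append(obj)
            return relevant

        # A stationary roadside barrier is dropped; a stalled car in the lane is kept.
        print(returns_worth_braking_for([
            {"lateral_offset_m": 6.0, "ground_speed_mps": 0.0},
            {"lateral_offset_m": 0.3, "ground_speed_mps": 0.0},
        ]))

    The hard part, as the replies below point out, is that "directly ahead" has to follow the predicted path through curves, not just a straight line out of the bumper.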

    --
    Everyone is somebody else's weirdo.
    • (Score: 2, Interesting) by ataradov on Sunday November 04 2018, @07:43AM (13 children)

      by ataradov (4776) on Sunday November 04 2018, @07:43AM (#757525) Homepage

      Safety railings are also directly ahead in the tight turns. I'm not sure if there are such tight turns on the highways, but there definitely are on off-ramps and highway interchanges.

      • (Score: 4, Interesting) by Runaway1956 on Sunday November 04 2018, @09:40AM (10 children)

        by Runaway1956 (2926) Subscriber Badge on Sunday November 04 2018, @09:40AM (#757548) Journal

        I'm not sure if there are such tight turns on the highways,

        It varies wildly. Interstate highways have very high standards, meaning you'll almost never see such tight turns. But, drive I-40 across N. Carolina, and you'll see them! US Highways also have pretty high standards, a little lower than Interstates. But, in the Carolinas and West Virginia, some other places, you'll see curves so tight you begin to wonder if you're looking at your own taillights ahead of you. Lesser highways - anything goes.

        What's certain is, when the car sees something directly ahead, if it masses more than a pound or two, the damned thing should BRAKE!! The more massive the object, the harder the braking should be. Any human moron can run into a granite face of a mountain - God knows I've seen it. We expect the programmers to be able to do better.

        • (Score: 2) by Runaway1956 on Sunday November 04 2018, @09:41AM

          by Runaway1956 (2926) Subscriber Badge on Sunday November 04 2018, @09:41AM (#757549) Journal

          and . . . I messed up that quote tag . . .

        • (Score: 2) by suburbanitemediocrity on Sunday November 04 2018, @10:26AM (1 child)

          by suburbanitemediocrity (6844) on Sunday November 04 2018, @10:26AM (#757560)

          I've seen them too in more rural areas. I think Texas.

          • (Score: 1, Funny) by Anonymous Coward on Sunday November 04 2018, @11:57AM

            by Anonymous Coward on Sunday November 04 2018, @11:57AM (#757577)

            Texas is not rural, it's polite wilderness.

        • (Score: 3, Informative) by number11 on Sunday November 04 2018, @04:08PM (4 children)

          by number11 (1170) Subscriber Badge on Sunday November 04 2018, @04:08PM (#757643)

          I'm not sure if there are such tight turns on the highways,

          It varies wildly. Interstate highways have very high standards, meaning you'll almost never see such tight turns. But, drive I-40 across N. Carolina, and you'll see them!

          The first time you take I-90 through downtown Cleveland OH, you discover that the sign painter wasn't exaggerating when painting the sign showing a 90 degree turn.

          • (Score: 3, Interesting) by Runaway1956 on Sunday November 04 2018, @05:21PM (3 children)

            by Runaway1956 (2926) Subscriber Badge on Sunday November 04 2018, @05:21PM (#757666) Journal

            Dead Man's Curve. It was snowing, the lake was doing its best to blow up onto the interstate - it was a miserable day. Traffic was moving about 20 mph, some braver souls were doing maybe 25. Some dumbass in a brand new Toyota Landcruiser came flying into the curve at about 50 mph or so. He learned the hard way. Few of us have seen that curve, I would imagine. The speed limit is 35 under the best of conditions. Hey - it's on Youtube! https://www.youtube.com/watch?v=M8otLNgtyGE [youtube.com]

            • (Score: 2) by number11 on Sunday November 04 2018, @10:24PM

              by number11 (1170) Subscriber Badge on Sunday November 04 2018, @10:24PM (#757749)

              Yup. If you survive your first encounter with that curve, you treat it with great respect after that.

            • (Score: 0) by Anonymous Coward on Monday November 05 2018, @06:38PM (1 child)

              by Anonymous Coward on Monday November 05 2018, @06:38PM (#758121)

              what kind of dumb ass puts a 90 on a highway...

              • (Score: 2) by Runaway1956 on Monday November 05 2018, @07:37PM

                by Runaway1956 (2926) Subscriber Badge on Monday November 05 2018, @07:37PM (#758156) Journal

                Politicians, of course. That highway could have taken any route at all through Cleveland - short of running it through the courthouse. Eminent domain was used in many cases when the Interstate Highways were built. They could have mowed down any neighborhood that wasn't home to the city's wealthiest and most influential people. They could have bulldozed right through any, or all, of the ethnic minority neighborhoods. But, someone wanted that shoreline view, or some such nonsense. And, it was probably the least valuable property in the city. Anything built there is going to flood. Cleveland doesn't get the lake effect snow that Buffalo gets, but it's the same lake, with pretty much the same weather patterns. The lake does try really really hard, at times, to submerge that stretch of highway. Maybe it succeeds sometimes, and I wasn't around to see it.

        • (Score: 0) by Anonymous Coward on Sunday November 04 2018, @07:18PM (1 child)

          by Anonymous Coward on Sunday November 04 2018, @07:18PM (#757702)

          What's certain is, when the car sees something directly ahead, if it masses more than a pound or two, the damned thing should BRAKE!!

          Good idea. Now how do you determine the mass of the object in front of the car?

          • (Score: 2) by Runaway1956 on Monday November 05 2018, @03:04AM

            by Runaway1956 (2926) Subscriber Badge on Monday November 05 2018, @03:04AM (#757834) Journal

            Camera, radar, laser, and maybe a spectrometer. If the computer isn't sure, you burn a hole in it to see what it's composed of. Hmm - what does a cat look like with a spectrometer?

      • (Score: 2) by captain normal on Sunday November 04 2018, @05:32PM (1 child)

        by captain normal (2205) on Sunday November 04 2018, @05:32PM (#757674)

        Pretty much any curve is pretty tight at 80 mph.

        --
        When life isn't going right, go left.
        • (Score: 0) by Anonymous Coward on Sunday November 04 2018, @07:31PM

          by Anonymous Coward on Sunday November 04 2018, @07:31PM (#757704)

          Which is 10mph over the legal limit in Florida.

    • (Score: 0, Troll) by Anonymous Coward on Sunday November 04 2018, @07:47AM (1 child)

      by Anonymous Coward on Sunday November 04 2018, @07:47AM (#757526)

      It is not an error. They know such things exist IRL. They make systems to take your money and enslave you and make you like your enslavement. The people who give money for autopilot systems prefer to be chained to their desks while doing repetitive tasks.

      If you want autopilot then pay for a driver. Failing that, ride the bus.

      • (Score: 0) by Anonymous Coward on Monday November 05 2018, @02:26AM

        by Anonymous Coward on Monday November 05 2018, @02:26AM (#757818)

        You're talking about people who have themselves mistaken for petty bourgeoisie, when they have in fact entered into a voluntary slavery agreement with a capitalist?

    • (Score: 3, Informative) by deimtee on Sunday November 04 2018, @09:27AM (3 children)

      by deimtee (3272) on Sunday November 04 2018, @09:27AM (#757541) Journal

      If it was simple to fix, they would. As the arstechnica article points out, if it tried to operate at highway speeds it would be emergency braking for overhead signs, bridges and anything near a curve in the road.
      They need to make it clearer that at speeds over 20 or 30 mph, auto-braking only avoids the vehicle traveling directly in front of you.

      --
      If you cough while drinking cheap red wine it really cleans out your sinuses.
      • (Score: 2) by Runaway1956 on Sunday November 04 2018, @09:47AM (1 child)

        by Runaway1956 (2926) Subscriber Badge on Sunday November 04 2018, @09:47AM (#757553) Journal

        A finely focused visible light analyzer would do wonders here. The aperture just doesn't see high enough to see overhead signs, or bridge structures. Curves in the road would take a little more thinking and experimentation, sure. But, police can use cameras to measure your speed. That camera is hooked up directly to a computer, which should be able to do the same thing. Anything approaching your vehicle at speed should be braked for, or evaded.

        Multiple sensors are present already, right? Just throw in a regular camera, suitable for measuring speed, and compare that sucker to all the other sensors onboard - radar(s), infrared, whatever.

        The automaker can't just throw up their hands, and say, "Well, it's good enough for our test track!"

        • (Score: 0) by Anonymous Coward on Sunday November 04 2018, @05:59PM

          by Anonymous Coward on Sunday November 04 2018, @05:59PM (#757681)

          Tesla only has cameras; they are not laser radars, and can't detect any more than human eyes can see. But they lack the human brain to process all that data. Overall, Tesla's Autopilot is a deadly invention, one of those that lure you into feeling safe, and then BAM!

          I always thought that the main purpose of a car safety system is to stop the car before it hits anything. The Autopilot, after seeing the leading car leave the lane (stupid box of bolts and chips, there is a reason for that!), only knows to accelerate the Tesla to maximum speed. But any driver can do that easily and with pleasure! Why then pay for Autopilot, if any modern car will safeguard you better, since it has radar and brakes before obstacles?

      • (Score: 0) by Anonymous Coward on Sunday November 04 2018, @02:58PM

        by Anonymous Coward on Sunday November 04 2018, @02:58PM (#757624)

        This isn't a particularly hard thing to solve, it's just that Teslas are cheaply made and that they're not making use of the technology available that would be able to handle that kind of situation without constantly jumping on the brakes.

        A real system can tell if something is stationary in the path of travel, above the path of travel or moving into the path of travel. Tesla is trying to do all this stuff with cheap technology when there's a more expensive technology that doesn't have these kinds of problems and they definitely deserve to be sued for the reckless misrepresentation of what the technology can do. If it can't handle a stationary object, then it shouldn't be on the road period as that's the main time that you're going to need to slam on the brakes.

    • (Score: 2) by Whoever on Sunday November 04 2018, @06:05PM (1 child)

      by Whoever (4524) on Sunday November 04 2018, @06:05PM (#757684) Journal

      I can tell you from personal experience that your statement is not accurate.

      The collision warning on my car kicks in on stationary objects quite frequently. We have turned its sensitivity down to the minimum to prevent invalid warnings about stationary objects.

      • (Score: 2) by mobydisk on Monday November 05 2018, @12:50PM

        by mobydisk (5472) on Monday November 05 2018, @12:50PM (#757956)

        The fact that it has a user-configurable sensitivity means that it isn't ready for prime time. That should be a developer-only tweak.

    • (Score: 2) by All Your Lawn Are Belong To Us on Monday November 05 2018, @07:12PM

      by All Your Lawn Are Belong To Us (6553) on Monday November 05 2018, @07:12PM (#758139) Journal

      But if they take on the responsibility to fix the problem of objects straight ahead, then they also buy into fixing the problem of objects in mid-lane while traveling along a slight curve. A human being can recognize that the centerline changes as one curves... but will the computer running the car?

      --
      This sig for rent.
  • (Score: 1, Insightful) by Anonymous Coward on Sunday November 04 2018, @07:36AM (1 child)

    by Anonymous Coward on Sunday November 04 2018, @07:36AM (#757524)

    The vehicle autopilot scam is meant to give away more control to machines, which are controlled by your enemies that you never get to see.

    After the complete failure of autopilot (as expected), they will come up with remote satellite control of your car, centrally controlled by the system. They will install cameras and a microphone inside the car cabin just for safety and improving customer experience*.

    *Improving customer experience has been a very effective indiscriminate surveillance mechanism by the corrupt system

    • (Score: 0) by Anonymous Coward on Sunday November 04 2018, @08:24PM

      by Anonymous Coward on Sunday November 04 2018, @08:24PM (#757714)

      Skynet

  • (Score: 2) by Username on Sunday November 04 2018, @07:50AM (10 children)

    by Username (4557) on Sunday November 04 2018, @07:50AM (#757527)

    Don't these cars have radar and auto braking? Maybe we should be removing the mandate for these systems, since it seems all they do is increase car prices without adding protection.

    • (Score: 2, Informative) by Anonymous Coward on Sunday November 04 2018, @09:20AM (9 children)

      by Anonymous Coward on Sunday November 04 2018, @09:20AM (#757540)

      There are a few too many links in the summary to R all TFAs, but you should read this one to understand why :
      https://arstechnica.com/cars/2018/06/why-emergency-braking-systems-sometimes-hit-parked-cars-and-lane-dividers/ [arstechnica.com]

      • (Score: 3, Interesting) by Runaway1956 on Sunday November 04 2018, @10:04AM (8 children)

        by Runaway1956 (2926) Subscriber Badge on Sunday November 04 2018, @10:04AM (#757557) Journal

        Well, that link makes it obvious that the systems are separate. And they shouldn't be. All of the sensing systems should be integrated, with one central computer controlling them. All this time, I've pretty much assumed that radar, cameras, laser, lidar, and whatever else were all tied together. You will never have a "safe" system until they are all integrated and calibrated together. Overhead signs, for instance - they can show up on radar, but if the camera, the laser, and all other sensors report the road is empty, then the central computer can logically decide to ignore that particular radar contact. If the camera reports that something is in the same spot that the radar is pinging, then the car should slow down, or even stop.

        Also - the article implies that emergency braking always spikes the brakes. FFS - the engineers aren't smart enough to just slow a vehicle sometimes? We all have to take evasive action routinely. Not all of us spike the brakes hard each and every time we want to avoid some hazard! Oftentimes, just taking your foot off the accelerator suffices to avoid a hazard. If not, touch the brakes and just let them drag a couple of seconds. The driver adjusts all throughout the slowing maneuver; it's not all or nothing!
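
        A toy sketch of the graduated response being described (Python, with made-up thresholds): scale the deceleration to the time-to-collision instead of treating braking as all or nothing.

            # Hypothetical graduated braking: the shorter the time-to-collision,
            # the harder the deceleration request, rather than an all-or-nothing spike.

            def requested_decel_mps2(gap_m, closing_speed_mps, max_decel_mps2=8.0):
                """Return a deceleration request (m/s^2) from range and closing speed."""
                if closing_speed_mps <= 0:
                    return 0.0                        # not closing on the object
                ttc_s = gap_m / closing_speed_mps     # time-to-collision in seconds
                if ttc_s > 4.0:
                    return 0.0                        # plenty of time: just coast
                if ttc_s > 2.0:
                    return 2.0                        # ease off / light braking
                if ttc_s > 1.0:
                    return 5.0                        # firm braking
                return max_decel_mps2                 # imminent: full emergency braking

            print(requested_decel_mps2(gap_m=60.0, closing_speed_mps=36.0))  # ~1.7 s TTC -> 5.0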

        • (Score: 2) by G-forze on Sunday November 04 2018, @10:48AM

          by G-forze (1276) on Sunday November 04 2018, @10:48AM (#757564)

          Also, since many of the crashes have been caused by the driver not actually paying attention, I'm guessing the car slowing down would rouse the driver from whatever he is doing so that he could take appropriate action. And since slowing down a bit is not a very risky thing to do, it could be done well in time if the computer detects something it cannot clearly classify, just to be sure.

          --
          If I run into the term "SJW", I stop reading.
        • (Score: 3, Informative) by Nuke on Sunday November 04 2018, @10:49AM (5 children)

          by Nuke (3162) on Sunday November 04 2018, @10:49AM (#757565)

          All of the sensing system should be integrated, with one central computer controlling them.

          Nope. That's creating a single point of failure. I can assure you that is not done in serious industrial safety systems.

          • (Score: 3, Insightful) by Runaway1956 on Sunday November 04 2018, @11:03AM (2 children)

            by Runaway1956 (2926) Subscriber Badge on Sunday November 04 2018, @11:03AM (#757570) Journal

            So, have a redundant backup to the central computer, if necessary. The point is, something needs to integrate all the input before it can make any kind of logical output. Your hearing may seem independent of your sight and your sense of feel, but oftentimes your hearing or your sense of touch draws your sense of vision to an event that requires closer examination.

            I don't care if the radar passes through one evaluation system on its way to the central computer, or if the visible light cameras pass through their own processor before the data gets to the central controller. But setting up all these subsystems so that they don't share data is never going to work.

            The radar controller may well panic at some contact - but that's alright, if within milliseconds it gets a message, "Ignore the contact bearing 001 degrees at 300 yards because it is an expected stationary hazard on a curve." The camera system can panic over something else in the roadway, but the laser assures the master system that the obstacle is actually just heat waves. All systems should accept double-check data from the other systems.

            I don't care how we might choose to work around a "single point of failure", but I'm certain that it can be done. Bottom line, at present, radar IS a "single point of failure". It can't be relied on, but people insist on relying on it.
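
            A minimal sketch of that kind of cross-check (Python, hypothetical interface): a contact only counts as a braking hazard once at least one other, independent sensor reports something near the same spot.

                # Hypothetical multi-sensor confirmation: a contact is confirmed only if
                # at least `min_sensors` modalities (radar, camera, lidar, ...) report an
                # object at roughly the same position. Positions are (x, y) in metres.

                def confirmed_hazards(contacts_by_sensor, match_dist_m=3.0, min_sensors=2):
                    """Return (sensor, position) pairs seen by enough independent sensors."""
                    confirmed = []
                    for sensor, contacts in contacts_by_sensor.items():
                        for (x, y) in contacts:
                            agreeing = sum(
                                any(abs(x - ox) < match_dist_m and abs(y - oy) < match_dist_m
                                    for (ox, oy) in other_contacts)
                                for other, other_contacts in contacts_by_sensor.items()
                                if other != sensor
                            )
                            if agreeing + 1 >= min_sensors:
                                confirmed.append((sensor, (x, y)))
                    return confirmed

                # The lone radar ping at 120 m is dropped; the object that both radar and
                # camera place about 40 m ahead is kept.
                print(confirmed_hazards({
                    "radar":  [(120.0, 0.0), (40.0, 1.0)],
                    "camera": [(41.0, 0.5)],
                }))

            The obvious trade-off: demanding agreement cuts false alarms like the overhead sign, but it also means a hazard seen by only one sensor gets ignored.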

            • (Score: 3, Interesting) by Immerman on Sunday November 04 2018, @06:23PM

              by Immerman (3985) on Sunday November 04 2018, @06:23PM (#757688)

              I partially agree - it makes sense to give the autopilot access to all the data. However, the entire point of having a backup safety system (autobraking or whatever) is to prevent a failure by the driver (in this case autopilot) from killing you - to do that, it must NOT allow the autopilot to override. If the emergency braking system freaks out and thinks it needs to brake, the car brakes. If the autopilot thinks it's wrong, then it needs to navigate the car in a manner that avoids triggering the emergency braking system.

              Otherwise having the emergency braking system at all is pointless - it sees an oncoming collision and starts to brake, milliseconds later the imperfect autopilot overrules it, and you end up slamming into the parked car anyway.
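
              A sketch of that priority rule (Python, hypothetical numbers): the final brake command is never weaker than what the emergency braking system asks for, no matter what the autopilot wants.

                  # Hypothetical arbitration where automatic emergency braking (AEB) wins:
                  # the autopilot can add braking but can never cancel an AEB request.

                  def arbitrate_brake(autopilot_decel_mps2, aeb_decel_mps2):
                      """Final deceleration command is the stronger of the two requests."""
                      return max(autopilot_decel_mps2, aeb_decel_mps2)

                  print(arbitrate_brake(autopilot_decel_mps2=0.0, aeb_decel_mps2=6.0))  # 6.0
                  print(arbitrate_brake(autopilot_decel_mps2=3.0, aeb_decel_mps2=0.0))  # 3.0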

            • (Score: 2) by Username on Monday November 05 2018, @12:25AM

              by Username (4557) on Monday November 05 2018, @12:25AM (#757791)

              I think he's talking about all the sensors sharing a data line when you used the term integrated. As in combined into one part. He's probably a Mac person, since this problem is common on MacBooks and AirBooks, etc. They have temp sensors on the same line as the light sensor on the camera, so when that fails and grounds out, or you get a speck of shit on a connector, it takes down the signal from the other sensors, making the MacBook fan run at 100% and throttling the CPU multiplier to base to prevent thermal damage from not successfully polling the temp sensor.

              Doesn't even have to be linked up, it just has to work. Something in the road, hit the brakes. Autopilot, like cruise control, should turn off when the brakes are engaged. Now, if he turned the braking feature off due to nuisance braking, it's still the feature's fault. It should work, or not be there at all. It's like having a seatbelt that rips apart when any kind of force is applied. He should sue not only Tesla but the manufacturer of the braking system and the legislator who added the mandate.

          • (Score: 0) by Anonymous Coward on Monday November 05 2018, @10:54AM

            by Anonymous Coward on Monday November 05 2018, @10:54AM (#757928)

            Current "self-driving" cars are a far cry from industrial safety systems.

          • (Score: 2) by mobydisk on Monday November 05 2018, @01:01PM

            by mobydisk (5472) on Monday November 05 2018, @01:01PM (#757964)

            What they have now is *worse* than having a single point of failure. They have multiple points of failure. There is a difference between "multiple computers" and "redundant computers."

            What they have now is 2 systems that don't communicate, so if either one fails the entire system fails. Lane changing computer fails? CRASH. Automatic cruise control system fails? CRASH. By integrating both into one, you reduce the likelihood of failure, plus the computer has the benefit of all the sensor data instead of one. So once they integrate them and update their algorithms, then add a redundant computer. Of course, a redundant computer is useless unless there are redundant sensors, and a redundant control system from those computers, and redundant wires to those sensors, and a system for arbitrating the redundant computers... and this is why industrial and medical devices can be very expensive.

            P.S. I work on industrial medical safety systems.
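
            As a rough illustration of the difference (Python, hypothetical): with redundant controllers, an arbiter takes, say, the median of their commands, so one faulty unit is outvoted instead of taking the whole system down.

                # Hypothetical triple-redundant arbitration: three controllers each compute
                # a brake command and the median is used, so a single controller stuck at
                # zero (or at full braking) is outvoted by the other two.

                from statistics import median

                def voted_brake_command(decel_requests_mps2):
                    """Median-vote the deceleration requests of redundant controllers."""
                    if len(decel_requests_mps2) < 3:
                        raise ValueError("need at least three redundant requests to vote")
                    return median(decel_requests_mps2)

                print(voted_brake_command([4.8, 5.1, 0.0]))  # failed-low unit outvoted -> 4.8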

        • (Score: 3, Informative) by Whoever on Sunday November 04 2018, @06:15PM

          by Whoever (4524) on Sunday November 04 2018, @06:15PM (#757685) Journal

          Well that link make is obvious that systems are separate.

          No, you are misreading the article. It says that the systems are separate on many vehicles, but does not say that they are separate on Teslas.

          Also - the article implies that emergency braking always spikes the brakes.

          Teslas don't always. The car gives an audible warning first.

          In fact that article is rather poorly researched. It discusses the limitations of radar, but radar is only used as a secondary input on Teslas. [tesla.com]

  • (Score: 2) by Nuke on Sunday November 04 2018, @10:44AM (7 children)

    by Nuke (3162) on Sunday November 04 2018, @10:44AM (#757563)

    SD Car fans will keep saying that computers make safer drivers than average humans. That ignores two things :-

    1) It assumes an average, and most of those fans are in the USA, so they assume USA averages. But the USA (mainly human) accident statistics are appalling compared with UK ones (about 5 times more per journey-mile despite far less crowded roads). Then even by UK statistics I am a safer driver than average, having had zero crashes while the average rate is > zero.

    2) If I die in a crash I'd rather it be my fault (or at least have been in a position to do something about it) than it be the fault of some programmer somewhere who failed to take into account the situation that killed me.

    • (Score: 5, Funny) by G-forze on Sunday November 04 2018, @10:51AM

      by G-forze (1276) on Sunday November 04 2018, @10:51AM (#757566)

      If you died in a crash, you wouldn't really care whose fault it was, since you'd be, you know, dead. :P

      --
      If I run into the term "SJW", I stop reading.
    • (Score: 4, Funny) by isostatic on Sunday November 04 2018, @10:51AM

      by isostatic (365) on Sunday November 04 2018, @10:51AM (#757567) Journal

      Everyone’s a safer driver than average until they have one crash.

      Every SD car is safer than average until it crashes

    • (Score: 2, Troll) by VLM on Sunday November 04 2018, @02:17PM (2 children)

      by VLM (445) on Sunday November 04 2018, @02:17PM (#757612)

      3) There's a lot of personal decision about risk-taking that's asymmetric. So accident rates in the USA are high because of 2am drunks and illegal alien unlicensed driver areas and so forth, so the average risk in the USA means something, but my chosen lifestyle results in a much lower risk. With self-crashing cars (don't do the PR thing and call them self-driving... they're self-crashing cars...) the risk is mysterious and maybe around the USA average, but I'm not USA average, so it's WAY riskier for me.

      It's kinda like deciding to legalize asbestos because the lung cancer risk is lower than the USA average of a mixed multicultural population of 4-pack-a-day boomers and non-smokers. Sure, maybe the asbestos lung cancer risk is lower than the risk for a hypothetical average 2-pack-a-day smoker who doesn't exist, but that means the non-smokers should be up in arms and marching with pitchforks.

      If you don't drink and drive, under the limit or not, and if you don't use drugs, if you're not half blind, etc etc in other words if you're an average "safe" driver, then don't accept the safety stats being merely as good as the average known to be shitty driver.

      • (Score: 1, Insightful) by Anonymous Coward on Sunday November 04 2018, @03:03PM

        by Anonymous Coward on Sunday November 04 2018, @03:03PM (#757625)

        If we're going to claim that these AI cars are safer than ones driven by people, we really should be excluding those sorts of things from the statistics. It's not hard to make a car that's safer than a drunk, high, asleep or somebody who spends half their time looking at a cellphone while driving.

        We also need to consider the fact that these cars are still only acceptable-ish when driven in good conditions and as the article notes, Teslas can't handle stationary objects next to the roadway. Which is something that pretty much anybody who drives a car and isn't in one of the aforementioned groups handles routinely without any drama.

      • (Score: 2) by Nuke on Sunday November 04 2018, @04:25PM

        by Nuke (3162) on Sunday November 04 2018, @04:25PM (#757652)

        Its kinda like deciding to legalize asbestos because the lung cancer risk is lower than the USA average of a mixed multicultural population of 4-pack-a-day boomers and non-smokers.

        I think a better analogy to making us use SD cars is making everyone smoke a cigarette every day because it is safer than the national average (across smokers and non-smokers) of two per day (or whatever the figure is).

    • (Score: 5, Interesting) by stretch611 on Sunday November 04 2018, @02:42PM

      by stretch611 (6199) on Sunday November 04 2018, @02:42PM (#757617)

      But the USA (mainly human) accident statistics are appalling compared with UK ones (about 5 times more per journey-mile despite far less crowded roads).

      It is very hard to make comparisons like this. Even comparisons from one part of the US to another are hard to make. Yes, the US has miles and miles of interstates that are not crowded... However, that is not where the majority of people drive. (obviously)

      In Atlanta, you have 3 major interstates going through the city... I-85, I-75, and I-20. I-85 starts NE of the city and goes SW, I-75 starts NW of the city and goes south, crossing over 85 like an elongated X with both of them merged in the middle. I-20 goes east and west of the city, crossing over the merged part of I-75/85. (Also, we have I-285, which literally is an odd-shaped circle around the whole city.)

      In the heart of Atlanta, where I-75 and I-85 are merged, we have 16 lanes of traffic... 8 in each direction. Traffic is so bad, it literally crawls to a stop during rush hour. Even off hours and weekends can see traffic come to a virtual stop. If you think you have seen traffic before, it takes on a different meaning when you have 8 lanes of interstate (which is controlled access of course) and you are not moving along with people in the other 7 lanes, and then you take a peek at the other side and see that those 8 lanes aren't moving either, so it doesn't matter if you are coming or going. And the kicker is that it is only cars in all 16 lanes... trucks are not allowed inside the city on the interstate unless they specifically have a drop off there (even then they are usually avoiding rush hour and deliver at night); they are forced to use I-285 and circumnavigate around the city, and even that can be just as slow despite 6 or 8 lanes in both directions.

      That is a crap-ton of traffic... especially if you consider that Atlanta is pretty much land locked and there are plenty of surface streets other than the interstates including roads like Highway 41 and Route 400 (which are traffic nightmares in their own right.)

      Then you can look at New York City... an island. Where Atlanta has hundreds of roads going in and out of it, you have no choice but to use a bridge or tunnel to get into NYC. While the George Washington Bridge has 14 lanes of traffic (only 2 fewer than I-85/75 in Atlanta), it gets hammered because there are only 3 ways to/from NYC from New Jersey. The Lincoln Tunnel only has a total of 6 lanes, and the Holland Tunnel only has 4 lanes of traffic. So for all the cars going back and forth from NJ to NYC, a total of 24 lanes carries every single car. It is no wonder that this is one of the worst commutes due to traffic in the world. And even politics has made the traffic worse [wikipedia.org] at times.

      Another thing to consider is that all 3 bridges and tunnels from NJ to NYC are toll roads... there are few things that make traffic worse than forcing everyone to slow down and stop to pay a toll.

      As bad as these two areas are for high traffic, both are generally accepted to be easier commutes than Boston and Los Angeles. (which I have never experienced)

      Please do not tell me that our driving is worse here in the US because our roads are less crowded... We have a lot of roads, and built plenty of interstates to criss-cross the country, but the ones that we overwhelmingly use are the ones by the big cities, not by the cornfields in the midwest.

      --
      Now with 5 covid vaccine shots/boosters altering my DNA :P
    • (Score: 1) by ChrisMaple on Monday November 05 2018, @07:36PM

      by ChrisMaple (6964) on Monday November 05 2018, @07:36PM (#758155)

      Part of the U.K.'s superior automobile safety record is due to the fact that the drivers' test required to get a license in the U.K. is rigorous. You have to know the rules and be able to abide by them flawlessly.
      People in England are more polite on average, and that helps a lot.

  • (Score: 3, Insightful) by Anonymous Coward on Sunday November 04 2018, @10:59AM (1 child)

    by Anonymous Coward on Sunday November 04 2018, @10:59AM (#757568)

    If you can't trust it to do normal things like avoiding stationary concrete walls, why even have the thing?

    • (Score: 0) by Anonymous Coward on Monday November 05 2018, @05:08AM

      by Anonymous Coward on Monday November 05 2018, @05:08AM (#757855)

      > If you cant trust it to do normal things...

      Well, a cynic would say that you must enjoy being a beta tester (although your role in the testing process may not have been divulged to you). Note, sometimes beta testers are called by another name--early adopters.

  • (Score: 1) by Sulla on Sunday November 04 2018, @02:45PM (3 children)

    by Sulla (5173) on Sunday November 04 2018, @02:45PM (#757618) Journal

    This to me sounds like the people who sued red bull because it didn't give them wings. When listening to Tesla PR they tout how great it is but they also are pretty careful with their words. Musk says that the cars are "getting better" and "the best out there", that does not translate to "perfect self-driving machine". When you are in a vehicle flying down the road at 55mph you are still responsible for paying attention. It is fine to let an autopilot take care of most of the work for you but you should still have your eyes on the road.

    I did not rtfa, but what was this guy doing when he was driving the car? Was he just watching as the crash happened rather than braking himself? Was he watching and hoping the car would brake for him? Was he just surfing the net and not paying attention to the road? I believe regardless of what the folks at Tesla say, there are probably regulations in California regarding what you must/must not be doing when you are driving.

    --
    Ceterum censeo Sinae esse delendam
    • (Score: 3, Funny) by number11 on Sunday November 04 2018, @04:19PM (1 child)

      by number11 (1170) Subscriber Badge on Sunday November 04 2018, @04:19PM (#757650)

      I did not rtfa but what was this guy doing when he as driving the car?

      From tfa:

      Hudson says he was "relaxing during his commute"

      That may be a euphemism for "taking a nap".

    • (Score: 3, Interesting) by legont on Sunday November 04 2018, @04:41PM

      by legont (4179) on Sunday November 04 2018, @04:41PM (#757656)

      He should claim that the mere existence of the autopilot permanently reduced his alertness and ability to drive; which is most likely true for the whole population.

      --
      "Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
  • (Score: -1, Offtopic) by Anonymous Coward on Sunday November 04 2018, @07:00PM

    by Anonymous Coward on Sunday November 04 2018, @07:00PM (#757699)

    elon musk is the richest african-american.

  • (Score: 3, Insightful) by JustNiz on Sunday November 04 2018, @09:53PM

    by JustNiz (1573) on Sunday November 04 2018, @09:53PM (#757736)

    It's obvious you're gonna cause accidents and kill people when you blatantly misrepresent an intelligent cruise control by simply calling it an autopilot, and also by misrepresenting that your cars can drive themselves in all conditions.

    https://www.tesla.com/autopilot [tesla.com]

    "All Tesla vehicles produced in our factory, including Model 3, have the hardware needed for full self-driving capability at a safety level substantially greater than that of a human driver."

    Subtle, huh? They actually said they have the hardware needed. It doesn't say whether they have the software too, or whether the cars actually can do it, but most people will obviously read it as exactly that.
