posted by martyb on Saturday September 07 2019, @08:06AM   Printer-friendly
from the drive-to-make-drivers-drive dept.

Arthur T Knackerbracket has found the following story:

The driver of this Model S was found to have only had his hands on the wheel for 51 seconds of the last 13 minutes 48 seconds of his trip.

One of the more highly publicized Tesla crashes in recent memory involved a man in Los Angeles plowing his Tesla Model S into the back of a fire truck. The car wasn't going all that fast and thankfully nobody was hurt, but it was a fairly gnarly crash nonetheless.

Part of the government's investigation into the crash involved finding out whether or not Tesla's Autopilot system had been engaged at the time of the collision and if so, determining whether or not the driver was paying attention to what was going on around them.

Well, it's been a while, but the National Transportation Safety Board (NTSB) has released part of the findings of its inquiry, according to a tweet published by the agency on Tuesday, and hey, guess what? The driver was found to have had his hands on the wheel as prescribed by Tesla (and good sense) for only 51 seconds of the final 13 minutes and 48 seconds of the drive. Even worse, his hands weren't on the wheel at all for the last 3 minutes and 41 seconds before the crash.

When questioned by the NTSB as to what he was doing at the time of the crash, the driver stated, "I was having a coffee and a bagel. And all I remember, that truck, and then I just saw the boom in my face and that was it."

Clearly, there was a breakdown in the system here, and while Autopilot isn't a perfect system and while we've criticized its name as being somewhat misleading, the fault here doesn't seem to lie solely with Tesla.

The moral of the story here is that the advanced driver assistance systems, like Autopilot, that are found in many of the vehicles being sold today are not a form of self-driving. There is no "self-driving" car on sale today, and it's the responsibility of the driver to pay attention to the world around them as they drive.

Tesla didn't immediately respond to Roadshow's request for comment.


Original Submission

 
  • (Score: 3, Insightful) by Nuke on Saturday September 07 2019, @09:02AM (2 children)

    by Nuke (3162) on Saturday September 07 2019, @09:02AM (#890898)

    FTFA :

    while Autopilot isn't a perfect system and while we've criticized its name as being somewhat misleading, the fault here doesn't seem to lie solely with Tesla.

    That makes it sound like Tesla is let off the hook. It is Tesla's fault for marketing a car that is not reliable enough for SD, but has enough SD-like features to tempt users to treat it as if it was. Tesla makes the mistake of underestimating the stupidity of the public.

    As for the driver, having spectacularly demonstrated that he is an idiot, he should simply be banned from driving any sort of car again: his particular problem is thus easily solved.

    • (Score: 4, Interesting) by Anonymous Coward on Saturday September 07 2019, @11:39AM

      by Anonymous Coward on Saturday September 07 2019, @11:39AM (#890932)

      Agreed.

      However, a few years back there was an article from Google about their own self-driving car research. They hired people to test their self-driving car prototypes and paid them to keep their hands on the wheel and their foot near the brake at all times. Their only job, again, was to keep their hands on the wheel and their foot near the brake. And 100% of the people hired - high school graduates, college graduates, PhD engineers, researchers - stopped paying attention eventually. Some took a few days, some took a few weeks, some took a few months. But nobody kept focus. So the lesson Google learned, that Tesla didn't, is that you can't release a self-driving system to customers until it's effectively perfect.

      I'm no fan of Big Brother Google. But in this they're smarter and safer than Tesla.

    • (Score: 2, Disagree) by theluggage on Saturday September 07 2019, @02:18PM

      by theluggage (1797) on Saturday September 07 2019, @02:18PM (#890976)

      Sorry - you're forgetting the law of conservation of responsibility. You gotta have a victim and a villain in every case. It has to be either 100% Tesla's fault (otherwise you're "victim-blaming") or 100% the driver's fault (or you're perpetrating the socialist "nanny state"). If every issue wasn't turned into a false dichotomy, how would social media pundits know which side to take?

      Just imagine, if almost everything that ever went wrong in the world had multiple contributing factors and a complicated web of blame, half the cases in court would get thrown out because both sides were at fault, and the streets would be full of starving lawyers...

      Of course, the real victims here are any innocent bystanders who get hit by irresponsible drivers driving irresponsibly-marketed cars... except, nowadays, the innocent bystander was probably meandering across the road while updating their Instagram...

      (Sorry - it's been one of those weeks...)

  • (Score: 4, Funny) by Rosco P. Coltrane on Saturday September 07 2019, @09:12AM

    by Rosco P. Coltrane (4757) on Saturday September 07 2019, @09:12AM (#890901)

    Don't call it "autopilot". Call it "virtual teenage chauffeur": that'll scare Tesla owners into paying attention.

  • (Score: 2) by Farkus888 on Saturday September 07 2019, @09:12AM (5 children)

    by Farkus888 (5159) on Saturday September 07 2019, @09:12AM (#890902)

    It is amazing that this many people this dumb can afford a car this expensive. It calls into question the story we are all told about smarts being the key to success these days. Of course, I also saw someone "driving" a Harley on the interstate with their arms crossed recently, so it isn't all Autopilot's fault.

    • (Score: 3, Funny) by Nuke on Saturday September 07 2019, @09:17AM

      by Nuke (3162) on Saturday September 07 2019, @09:17AM (#890903)

      Obviously, the Harley had an Autopilot.

    • (Score: 2) by MostCynical on Saturday September 07 2019, @09:47AM

      by MostCynical (2589) on Saturday September 07 2019, @09:47AM (#890907) Journal

      Was his fly undone?

      --
      "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
    • (Score: 2) by acid andy on Saturday September 07 2019, @11:47AM

      by acid andy (1683) on Saturday September 07 2019, @11:47AM (#890933) Homepage Journal

      Credit?

      --
      If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
    • (Score: 3, Interesting) by Anonymous Coward on Saturday September 07 2019, @08:05PM

      by Anonymous Coward on Saturday September 07 2019, @08:05PM (#891070)

      > "driving" a Harley on the interstate with their arms crossed

      It wasn't me...but if you were in western PA in the early 1980s it could have been me. I was a test rider for one of the tire companies, who were trying to get some tire business from Harley-Davidson. One problem was steering shake on longitudinal rain grooves, which were cut into a stretch of rural interstate that we used for testing. The test was to get lined up in the lane, check for nearby traffic (usually none), then lock the throttle and ride no hands for several miles to the next exit. The motorcycle had a data recorder for speed, steering angle and several other channels (have forgotten details). The data was analyzed later (no laptops back then) and was combined with my subjective comments into the final report. There was huge variation between the different experimental tires I rode, some barely noticed the grooves, others latched on to the grooves and shook the bike continually--but it always went straight in the macro sense, never had to grab the bars to keep in my lane.

      It turned out to be more comfortable to ride from the passenger's seat, so I would move back after locking the throttle, small amounts of body lean would nudge it to stay in lane... That Harley was so stable that what might have looked crazy from the outside was trivial as the rider/driver.

      At the same time, I was just as "hyper aware" as when riding any motorcycle, watching for nearby animals, debris on the road, etc--and also watching the steering shake so I could give a good subjective report on each tire.

    • (Score: 0) by Anonymous Coward on Sunday September 08 2019, @12:19AM

      by Anonymous Coward on Sunday September 08 2019, @12:19AM (#891122)

      It is amazing that this many people this dumb can afford a car this expensive.

      They are your bosses: The people who get paid much more than you, while possibly knowing much less than you, to get your work to provide more benefit for the company than you both cost.

  • (Score: 2) by Knowledge Troll on Saturday September 07 2019, @10:24AM (1 child)

    by Knowledge Troll (5948) on Saturday September 07 2019, @10:24AM (#890914) Homepage Journal

    Faulting the Tesla customers for not following the rules/documentation associated with the autopilot system should be called out and fault does belong with those people. However Elon tweets about how cool it is to tweet while his car drives him around. Elon doesn't even follow his own rules for supervised operation of autopilot. All kinds of fault belongs right there with Musk setting a garbage example for his customers behavior.

  • (Score: 0) by Anonymous Coward on Saturday September 07 2019, @10:30AM

    by Anonymous Coward on Saturday September 07 2019, @10:30AM (#890915)

    Los Angeles? An identical incident involving a Tesla on Autopilot hitting a truck at full speed happened in Moscow a couple of months ago. The driver survived, since he tweeted about it later.

  • (Score: 5, Insightful) by Quicksilver on Saturday September 07 2019, @10:39AM (4 children)

    by Quicksilver (1821) on Saturday September 07 2019, @10:39AM (#890917)

    So the basic idea is we will make a machine that takes care of everything but we will "require an operator to attentively monitor its operation".

    So the idea is that we are going to have people be completely unengaged from the process of driving but somehow have heightened awareness of what is going on around them. REALLY? You can't even get people who are in sole control of a car to stay engaged, so how do you think insulating them from the process of negotiating 2 tons of vehicle is going to work out?

    It is very simple: Any time you automate a human ability the human will lose that ability.

    • (Score: 4, Interesting) by Nuke on Saturday September 07 2019, @12:15PM (2 children)

      by Nuke (3162) on Saturday September 07 2019, @12:15PM (#890936)

      So the basic idea is we will make a machine that takes care of everything but we will "require an operator to attentively monitor its operation".

      The human's mental task is greater with Autopilot than if he were driving manually.

      This is his mental task when driving manually :
      1) Decide what should be done next
      2) That's it

      This is his mental task when using Autopilot :
      1) Decide what should be done next
      2) Decide whether the Autopilot is doing (1) or doing something different
      3) If (from (2)) Autopilot is doing something different, decide whether its alternative method is acceptable or needs to be over-ridden

      Using Autopilot sounds really stressful to me. The only thing it seems to save (at best) is the physical "effort" of steering and braking. I don't know about most cars, but these are almost effortless power-assisted operations on mine.

      Of course, nothing forces the Autopilot equipped driver to go through the routine I have described above. He can just switch off mentally and eat a bagel and drink a coffee instead.

      • (Score: 3, Interesting) by acid andy on Saturday September 07 2019, @01:51PM (1 child)

        by acid andy (1683) on Saturday September 07 2019, @01:51PM (#890962) Homepage Journal

        That mental task is basically what back seat (or passenger seat) drivers like to do most of the time. You would hope also that a competent driving instructor would be even better at it. That still leaves the questions though of what percentage of drivers are good back seat drivers and, as you note, whether Autopilot would be of any benefit at all to them. I think I agree that it wouldn't.

        --
        If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
        • (Score: 3, Insightful) by theluggage on Saturday September 07 2019, @02:33PM

          by theluggage (1797) on Saturday September 07 2019, @02:33PM (#890979)

          You would hope also that a competent driving instructor would be even better at it.

          ...and those are the only people who, at the moment, should be testing any "driver assistance" device capable of being (ab)used as "self-driving" (other than, maybe, emergency braking or warning systems). After taking some sort of test to ensure the "competent" bit. While wired with a mike and required to make a commentary on the vehicle's progress (which might help to keep them focussed).

          Unfortunately, it's much cheaper - maybe actually profitable - to "crowdsource" crucial safety testing to anybody who can afford (a loan for) a luxury car.

          Self-driving isn't ready until it is ready - and it will be ready when it is safe and legal to use while drunk, falling asleep or in iPhone Zombie mode - because idiots will use it like that.

    • (Score: 0) by Anonymous Coward on Sunday September 08 2019, @05:12AM

      by Anonymous Coward on Sunday September 08 2019, @05:12AM (#891199)

      It's the test driver's job. Shorter test drives could help, or they could be forced to switch modes at every fifth intersection or something. You could collect useful data in either mode.

  • (Score: 2) by HiThere on Saturday September 07 2019, @04:07PM (2 children)

    by HiThere (866) Subscriber Badge on Saturday September 07 2019, @04:07PM (#890997) Journal

    Most of the above comments are reasonable, considering this case as an individual case. Another question of interest is:
    "In the current state of development of automated driving, is the automation safer than the average driver?"

    I've seen various assertions in both directions, but haven't really seen any plausible analysis.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
    • (Score: 1, Insightful) by Anonymous Coward on Saturday September 07 2019, @08:15PM

      by Anonymous Coward on Saturday September 07 2019, @08:15PM (#891076)

      > is the automation safer than the average driver?"

      I don't give a shit about that criterion. It's got to be *much* safer than the average driver, given that the average includes drunks and all kinds of impaired/distracted drivers.

      Next time, perhaps you will write something like this:
          "is the automation safer than an experienced, alert, attentive and unimpaired driver?"

      Because that's what I am, in particular when driving my car with manual transmission.

    • (Score: 3, Funny) by Anonymous Coward on Saturday September 07 2019, @11:26PM

      by Anonymous Coward on Saturday September 07 2019, @11:26PM (#891112)

      If the artical intelligence of autopiloting a car is anything like the artificial intelligent of autocorrecting my typing, it not really yet.

  • (Score: 0) by Anonymous Coward on Saturday September 07 2019, @08:09PM (5 children)

    by Anonymous Coward on Saturday September 07 2019, @08:09PM (#891074)

    Not driver replacement. Yet.

    • (Score: 2) by All Your Lawn Are Belong To Us on Sunday September 08 2019, @02:30AM (4 children)

      by All Your Lawn Are Belong To Us (6553) on Sunday September 08 2019, @02:30AM (#891145) Journal

      Assists with what? If the driver is supposed to have hands on the wheel (and presumably a foot ready to mash the brakes), and the driver has to be as alert and watchful as if he were driving himself, then what is the system assisting with, exactly?

      --
      This sig for rent.
      • (Score: 0) by Anonymous Coward on Sunday September 08 2019, @05:13AM

        by Anonymous Coward on Sunday September 08 2019, @05:13AM (#891200)

        Causing anxiety/carelessness.

      • (Score: 0) by Anonymous Coward on Sunday September 08 2019, @02:27PM (2 children)

        by Anonymous Coward on Sunday September 08 2019, @02:27PM (#891299)

        Assists with what?

        Assist in not driving off the road or plowing into a pedestrian. Assist in not confusing the gas with the brake.

        Like airbags and seat belts that assist you in not dying in a crash. Maybe they don't do a perfect job, but better than nothing.

        • (Score: 0) by Anonymous Coward on Monday September 09 2019, @02:16AM

          by Anonymous Coward on Monday September 09 2019, @02:16AM (#891498)

          Um, didn't you read the story? In this case (and a few others that appear similar) the "assist" took the Tesla straight into the back of the stopped fire truck. In another case it was straight into a temporary barrier set up at a freeway lane closing. Since it's happened more than once, this behavior was not trained out of the system after the first crash, and it may not be possible to fix--since the software would be constantly braking for stopped objects near the car's path (like road signs)...so they have to be ignored.

          IMHO, Tesla is using their customers to beta test their system. But, many of the customers don't even know what beta testing means. Personally, I only know one couple that own a Model S who also realize what "Autopilot" is--they are both experienced software developers and fully understand that they are beta testing. As they use the software they are also keeping track of good and bad behaviors. I fear (a bit) for the lives of the other Tesla owners that I know.

        • (Score: 1, Insightful) by Anonymous Coward on Monday September 09 2019, @05:50PM

          by Anonymous Coward on Monday September 09 2019, @05:50PM (#891767)

          Airbags and seat belts do things that one cannot do in a crash: Deploy a (relatively) gradual counterforce to slow your impact (in a way your arms and legs cannot do) and to restrain you in the coronal plane (in a way your arms/legs cannot do) at any speed that matters. They are providing assistance.

          Not driving off the road or plowing into a pedestrian are things that, even with Autopilot, the driver is still supposed to be responsible for preventing. That the "driver assist" still permits them to happen in given circumstances proves that it is not ready to take on those functions independently. That one still has to keep hands on the controls during these phases proves it is not assisting you: you must still maintain awareness and be prepared to do the work of avoiding a crash at all times.

          If you are my assistant and we are nailing 2x4's in a frame, and your job is to hold my nails and hand them to me, if I nevertheless have to have my hand around the nails at all moments to avoid your dropping them then you are not truly assisting me in any way but name.
