

posted by Fnord666 on Thursday January 25 2018, @02:16AM
from the stay-alert-stay-alive dept.

El Reg reports

[January 23] a Tesla Model S slammed into a stationary firetruck at around 65mph on Interstate 405 in Culver City, California. The car was driven under the fire engine, although the driver was able to walk away from the crash uninjured and refused an offer of medical treatment.

The motorist claimed the Model S was driving with Autopilot enabled when it crammed itself under the truck. Autopilot is Tesla's super-cruise-control system. It's not a fully autonomous driving system.

[...] The fire truck was parked in the carpool lane of the road with its lights flashing. None of the fire crew were hurt, although Powell noted that if his team had been in their usual position at the back of the truck then there "probably would not have been a very good outcome."

Tesla will no doubt be going over the car's computer logs to determine exactly what happened, something the California Highway Patrol will also be interested in. If this was a case of the driver sticking on Autopilot and forgetting their responsibility to watch the road ahead, it wouldn't be the first time.

In 2016, a driver was killed after both he and the Tesla systems missed a lorry pulling across the highway. A subsequent investigation by the US National Transportation Safety Board found the driver was speeding and had been warned by the car six times to keep his hands on the wheel.

Tesla has since beefed up the alerts the car will give a driver if it feels they aren't paying full attention to the road. The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.

Previous: Tesla's Semiautonomous System Contributed to Fatal Crash


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3, Insightful) by tftp on Thursday January 25 2018, @03:16AM (24 children)

    by tftp (806) on Thursday January 25 2018, @03:16AM (#627523) Homepage

    Here I presume that the driver told the truth and the autopilot was engaged. But how could it be that neither the bag of salty water nor the box of silicon shards could see a fire truck with all its lights on? Most importantly here, why is Tesla's autopilot so bad? Perhaps we (humanity) want to ban autopilots that work in 99.9% of cases and kill the driver in 0.01%? In other words, we want either no autopilot, or an autopilot that is real (Waymo etc.) - with cameras, lidars, etc. A halfway beast is too dangerous.

  • (Score: -1, Troll) by Anonymous Coward on Thursday January 25 2018, @03:23AM (3 children)

    by Anonymous Coward on Thursday January 25 2018, @03:23AM (#627530)

    Seems pretty obvious to me, no one trained the "AI" to deal with an emergency vehicle stopped in the HOV (carpool/hybrid/EV) lane. Just another example of Tesla using its customers for beta testing, nothing to see here, move along please...

    • (Score: 5, Informative) by Anonymous Coward on Thursday January 25 2018, @01:00PM (2 children)

      by Anonymous Coward on Thursday January 25 2018, @01:00PM (#627660)

      Update from Wired -- https://www.wired.com/story/tesla-autopilot-why-crash-radar/ [wired.com]

      Why Tesla's Autopilot Can't See a Stopped Firetruck
      ...
      This surprisingly non-deadly debacle also raises a technical question: How is it possible that one of the most advanced driving systems on the planet doesn't see a freaking fire truck, dead ahead?

      Tesla didn't confirm the car was running Autopilot at the time of the crash, but its manual does warn that the system is ill-equipped to handle this exact sort of situation: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

      Volvo's semi-autonomous system, Pilot Assist, has the same shortcoming. Say the car in front of the Volvo changes lanes or turns off the road, leaving nothing between the Volvo and a stopped car. "Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed," Volvo's manual reads, meaning the cruise speed the driver punched in. "The driver must then intervene and apply the brakes.” In other words, your Volvo won't brake to avoid hitting a stopped car that suddenly appears up ahead. It might even accelerate towards it.

      The same is true for any car currently equipped with adaptive cruise control, or automated emergency braking. It sounds like a glaring flaw, the kind of horrible mistake engineers race to eliminate. Nope. These systems are designed to ignore static obstacles because otherwise, they couldn't work at all.

      Looks like parent had a point, even if made in a crude way.

      The Wired story has more explanation of the design tradeoffs involved, and of why lidar is needed (because, unlike radar, it can distinguish between road furniture (signs, etc.) and an actual obstacle).
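
      To make that concrete, here is a rough, purely illustrative Python sketch of the behaviour both manuals describe (made-up names and thresholds, not anyone's actual firmware): the controller only tracks a moving lead target and falls back to the stored set speed when none is found, because stationary radar returns are filtered out as clutter.

        from dataclasses import dataclass

        @dataclass
        class RadarTarget:
            range_m: float     # distance to the detected object, in metres
            speed_mps: float   # the object's own speed; ~0 for a parked fire truck

        def pick_lead(targets, min_moving_mps=2.0):
            """Nearest *moving* target; stationary returns (signs, bridges,
            parked vehicles) are discarded to avoid constant phantom braking."""
            moving = [t for t in targets if t.speed_mps > min_moving_mps]
            return min(moving, key=lambda t: t.range_m) if moving else None

        def cruise_target_speed(targets, set_speed_mps, follow_gap_m=40.0):
            """Grossly simplified adaptive-cruise decision."""
            lead = pick_lead(targets)
            if lead is None:
                # No moving lead vehicle: resume the stored set speed,
                # even if a stationary obstacle sits dead ahead.
                return set_speed_mps
            if lead.range_m < follow_gap_m:
                return min(set_speed_mps, lead.speed_mps)   # hold the gap
            return set_speed_mps

        # The lead car swerves away, leaving a parked fire truck 60 m ahead:
        print(cruise_target_speed([RadarTarget(60.0, 0.0)], set_speed_mps=29.0))
        # -> 29.0 (full set speed), which is why the driver has to brake.

      The filtering is a deliberate design choice: radar alone can't reliably tell a parked truck from an overhead sign or roadside furniture, which is where lidar comes in.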

      • (Score: 2, Interesting) by tftp on Thursday January 25 2018, @07:19PM

        by tftp (806) on Thursday January 25 2018, @07:19PM (#627818) Homepage
        This answer is exactly what I was asking for! But with this knowledge it is beyond scary to use this killer feature. If the leading car leaves the lane, there must be a reason! Usually it's not a problem, but sometimes there is something ahead in the lane. In my experience it has been rocks, wood, a large chair, a broken-down car, a police car, a car stopped at a red light (very common)... every driver watches for these things, as he may need to brake right away. However, this "autopilot" does the opposite of what a driver would do.
      • (Score: 3, Informative) by gawdonblue on Thursday January 25 2018, @11:25PM

        by gawdonblue (412) on Thursday January 25 2018, @11:25PM (#627948)

        I was a passenger in one of these adaptive cruise control cars yesterday when both the car in front and the car I was in made a turn, and they very nearly collided. Basically the car ahead slowed to take the turn and so the car I was in automatically slowed to keep the distance, but when that car disappeared just around the corner the car I was in accelerated into the "clear" space and our driver had to brake very heavily to make the turn and then steer to avoid the slower car in front. It was a little bit scary.

  • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @03:58AM (1 child)

    by Anonymous Coward on Thursday January 25 2018, @03:58AM (#627536)

    Doesn't necessarily require LIDAR. If safety stats are better than human, the technology should be allowed. If it's worse than humans, it needs more development.

    • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @09:48PM

      by Anonymous Coward on Thursday January 25 2018, @09:48PM (#627883)

      > If safety stats are better than human, the technology should be allowed.

      Which humans are you doing stats on?

      I claim that I'm about 10x less likely to get into a serious accident than the lumped total of the USA driving population. I don't drink or otherwise drive impaired or sleepy. My car is well maintained, I keep the windows clear (of snow, etc), use seat belts all the time, and the lights are all working. I'm well past the hormone-infused youth stage, but not yet old enough that my senses are going bad. And I've attended several advanced driver training classes, starting when I was a teen and including some race track training (go fast on the track, not on the road).

      That's not to tempt fate and say that I'm not going to have a bad accident, but I believe my chances of a safe trip are much higher than the average.

  • (Score: 5, Insightful) by MostCynical on Thursday January 25 2018, @04:04AM (1 child)

    by MostCynical (2589) on Thursday January 25 2018, @04:04AM (#627538) Journal

    Autopilot engaged.

    Brain disengaged.

    Car totalled.

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
    • (Score: 2) by bob_super on Thursday January 25 2018, @06:36PM

      by bob_super (1357) on Thursday January 25 2018, @06:36PM (#627794)

      Driver unterminated, Darwin Saddened.
      Fire pants browned.

  • (Score: 3, Interesting) by coolgopher on Thursday January 25 2018, @04:05AM

    by coolgopher (1157) on Thursday January 25 2018, @04:05AM (#627539)

    It does seem to me that the most likely explanation is that the autopilot was not, in fact, engaged at the time.

    Not having a Tesla, I don't know under what conditions it might disengage the autopilot, but I have a vague recollection of it giving off a bunch of beeps before doing so. Perhaps those beeps were ignored?

    The investigation should tell us more once it's done.

  • (Score: 1, Insightful) by Anonymous Coward on Thursday January 25 2018, @04:09AM (4 children)

    by Anonymous Coward on Thursday January 25 2018, @04:09AM (#627541)

    In my younger years, there was an urban legend about the operator of a motor home who set the "cruise control" and walked back into the kitchenette to make a sandwich.
    The vehicle crashed, of course.
    (Labeling it "speed control" would make it less prone to an erroneous judgment of the device's capabilities.)

    In the prior incident (the "lorry" thing mentioned in a comment above), the driver was speeding and watching a Harry Potter movie.
    It wouldn't surprise me a bit if there was a similar distraction involved in this latest case.

    ...and, as TFS mentions, the vehicle will bitch at you if you take your hands off the steering wheel.
    It should be clear that it is just an aid.

    -- OriginalOwner_ [soylentnews.org]

    • (Score: 1) by anubi on Thursday January 25 2018, @10:03AM (2 children)

      by anubi (2828) on Thursday January 25 2018, @10:03AM (#627615) Journal

      I have a "speed control" in my van. I will not use it for this very reason.

      When I re-do its wiring, that is going to be one of the first things to be permanently removed from service.

      I take driving a vehicle extremely seriously. Not only for me and my property, but for everyone else on the road as well.

      Nobody's safe if people aren't paying attention when driving.

      Having a vehicle under the control of an inattentive driver is worse than putting a live gun in a child's playpen. While a child may take out a person, an inattentive driver can easily wipe out a whole family, possibly two or more families - in one big bang.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
      • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @02:50PM (1 child)

        by Anonymous Coward on Thursday January 25 2018, @02:50PM (#627696)

        That is an overreaction.

        Cruise control removes the need to continually monitor speed. I tend to go too fast if left to my own devices; cruise control keeps me from doing that. On top of that, I have found myself following too close in the past, and adaptive cruise control keeps that from happening.

        So kudos if you are the perfectly attentive driver who can constantly monitor speed, lane position, distance from the car in front of you, and traffic to the side and behind you for many hours on a road trip. I doubt you're nearly as good as you think you are.

        You aren't the one in the left-hand lane going five under, are you? It's my experience that those drivers also have an inflated sense of their own driving ability.

        • (Score: 2) by bob_super on Thursday January 25 2018, @06:43PM

          by bob_super (1357) on Thursday January 25 2018, @06:43PM (#627800)

          > You aren't the one in the left-hand lane going five under, are you?
          > It's my experience that those drivers also have an inflated sense of their own driving ability.

          Not surprising, since they are always told by the insurance that they are at "no fault" for getting rear-ended.
          If I were a cop, left-lane cruisers would be in pain (their wallets, at least). What's so evil about the right lane? I love it because it's always the empty one.

    • (Score: 1) by khallow on Thursday January 25 2018, @03:24PM

      by khallow (3766) Subscriber Badge on Thursday January 25 2018, @03:24PM (#627706) Journal

      (Labeling it "speed control" would make it less prone to an erroneous judgment of the device's capabilities.)

      Guess I don't see the difference between "speed" and "cruise" here. When it comes to idiots, I doubt such labels matter much.

  • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @07:08AM

    by Anonymous Coward on Thursday January 25 2018, @07:08AM (#627579)

    Well, since the car crammed itself under the firetruck, it's possible that the box of silicon shards missed it simply because it was too high and therefore out of whatever height range the cameras/lidar/etc. are set for. As for the bag of salty water, there are oodles of possibilities: chemical impairment, cell/smart phone distraction, sleep deprivation, being an idiot, eating or drinking, and that's without looking up any statistics on car crash causes. I'm sure I missed a few.

  • (Score: 4, Informative) by GreatAuntAnesthesia on Thursday January 25 2018, @11:00AM (5 children)

    by GreatAuntAnesthesia (3275) on Thursday January 25 2018, @11:00AM (#627628) Journal

    why is Tesla's autopilot so bad?

    The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.

    Sounds to me like Tesla's autopilot is actually pretty good. Good doesn't have to mean "no accidents at all", it just has to mean "as few or fewer accidents than a human driver".
    Note also that the human driver should have been watching the road in this case. That's how Tesla's autopilot is supposed to be used.

    Perhaps we (humanity) want to ban autopilots that work in 99.9% of cases and kill the driver in 0.01%

    Well, rather than working with numbers pulled out of your exhaust pipe, why don't we look at some actual, real statistics? [electrek.co]
    Turns out the Tesla autosteer crash rate is about 0.08 crashes per million miles. Unfortunately I can't find any relevant statistics for human drivers for comparison (all my google searches come up with fatalities per million miles, not crashes per million miles.) However, if you consider that a million miles represents a lifetime of driving for your average driver (10-20k miles per year), then 0.08 crashes per million miles works out to roughly a one-in-twelve chance of a reportable crash (which won't necessarily be fatal or even life-changing) over fifty to a hundred years of driving. That seems pretty good to me, and probably at least as safe as the average human driver.
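
    As a quick back-of-the-envelope check of that figure (just arithmetic on the numbers above, with the 0.08 stat taken at face value):

      autosteer_crashes_per_million_miles = 0.08   # the electrek figure quoted above
      lifetime_miles = 1_000_000                   # roughly 50-100 years at 10-20k miles/year
      expected_crashes = autosteer_crashes_per_million_miles * lifetime_miles / 1_000_000
      print(expected_crashes)   # 0.08, i.e. roughly one chance in twelve over a driving lifetime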

    • (Score: 3, Interesting) by nobu_the_bard on Thursday January 25 2018, @02:50PM

      by nobu_the_bard (6373) on Thursday January 25 2018, @02:50PM (#627695)

      That seems probable. The fact that half the Tesla crash reports make the news probably causes the same problem airplane crashes have, where coverage distorts the perception of how common they are.

    • (Score: 2) by AthanasiusKircher on Thursday January 25 2018, @03:12PM (2 children)

      by AthanasiusKircher (5291) on Thursday January 25 2018, @03:12PM (#627701) Journal

      Before I reply, note first that I mostly agree with your general point -- the Tesla "autopilot" stuff seems like it probably does a lot more good than bad. However, I'd argue about the standards you're using a bit...

      The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.
      Sounds to me like Tesla's autopilot is actually pretty good. Good doesn't have to mean "no accidents at all", it just has to mean "as few or fewer accidents than a human driver".

      That shouldn't be the relevant standard for "good." One should also consider whether or not the system causes accidents that would not have occurred in the first place. A "good" system might cause a few new incidents in unexpected scenarios, but usually our standards shouldn't just be about the overall accident rate.

      To put this in a different context, say you had a daily supplement that "cuts fatal heart attacks by 40%." Sounds great, right? As a pure stat, it certainly sounds promising. But say I told you that in a given population, there were generally 1000 fatal heart attacks. And this supplement seemed to prevent all 1000 of those high-risk folks from having a heart attack. But it also CAUSED 600 fatal heart attacks to happen in otherwise relatively low-risk folks. Overall, it cut heart attack incidence in the population by 40%, but I don't think we'd call this a "good" drug... it's killing large numbers of people, even while saving others. The side effects may not be worth the benefit.
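
      (Spelling out the arithmetic in that hypothetical: baseline 1000 fatal heart attacks; with the supplement, the original 1000 are prevented but 600 new ones are caused, so 600 remain, and (1000 - 600) / 1000 = a 40% overall reduction -- even though the drug itself kills 600 people who would otherwise have been fine.)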

      To be clear, I'm NOT saying that's true here with Tesla. But it's an important reason why we should pay attention to cases that appear like they might be a failure of the Autopilot system to have reasonable behavior.

      I'll also agree with you if you argue that Tesla gets a lot of bad press for such incidents. But they also invite it. Musk tries to get all the media attention he can, and that means he's also going to get negative stuff when something bad happens. He also tends to get really defensive at any criticism. AND (perhaps most importantly), Tesla steadfastly refuses to alter the name of their "Autopilot" feature, despite the fact that huge numbers of people who hear that name clearly don't understand that it's mostly just enhanced cruise control. So, you can argue about the idiots who abuse it, but I think fewer idiots would abuse it if it had a different name. But that's Tesla's marketing decision -- they obviously think they'll get more attention and sell more cars with "Autopilot," so they have to suck it up when negative press comes along because idiots misunderstand that name.

      Well, rather than working with numbers pulled out of your exhaust pipe, why don't we look at some actual, real statistics? [electrek.co]
      Turns out the Tesla autosteer crash rate is about 0.08 crashes per million miles. Unfortunately I can't find any relevant statistics for human drivers for comparison (all my google searches come up with fatalities per million miles, not crashes per million miles.)

      Again, statistical comparisons should be done with care. It doesn't make much sense to compare an enhanced cruise control feature that's likely most used in open-highway situations to a general driving stat for humans (which includes high-density traffic situations where most crashes occur).

      Perhaps a more apt comparison would be to look at the number of crashes with "Autopilot" vs. the number of crashes in cars with humans using standard cruise control. That would probably be a more like-to-like comparison. I suspect "Autopilot" would do significantly better there too, because normal cruise control (like "Autopilot") tends to lead people to be more distracted while driving... but normal cruise control has no ability to respond, whereas "Autopilot" has more enhanced safety features.

      • (Score: 2) by GreatAuntAnesthesia on Thursday January 25 2018, @04:29PM (1 child)

        by GreatAuntAnesthesia (3275) on Thursday January 25 2018, @04:29PM (#627736) Journal

        Interesting points, but I don't think your heart attack analogy stands up.

        In your scenario, you have a clearly defined default position, a "natural" state: everybody is at risk of a heart attack, some high risk and some low.
        I'd argue that in the world of cars, there is no natural state. Human drivers are the default, but only because that technology was invented first. Everybody has a heart[1], but not everybody has to have a car. Indeed, one could imagine a society with no cars at all. We choose to have cars in our society, to pay their cost in lives for all the benefits and conveniences they bring. The autopilot isn't disrupting the natural order of things in the way your heart attack drug is. Or if it is, then the human drivers are too, and the only difference between the two options is the number of deaths.

        Put it this way: If we existed in some improbable alternate universe where Tesla Autopilot had been invented before manual controls, would we be sat here arguing whether putting humans behind the wheel would rightfully save some lives at the expense of many more?

        [1] Insert obligatory snark here about Rupert Murdoch / Dick Cheney / Donald Trump.

        • (Score: 2) by AthanasiusKircher on Thursday January 25 2018, @07:26PM

          by AthanasiusKircher (5291) on Thursday January 25 2018, @07:26PM (#627821) Journal

          Put it this way: If we existed in some improbable alternate universe where Tesla Autopilot had been invented before manual controls, would we be sat here arguing whether putting humans behind the wheel would rightfully save some lives at the expense of many more?

          I take your point. But in the very way you just framed that, you automatically are presuming a beneficial outcome. My point wasn't just about Tesla Autopilot (which I explicitly admitted likely prevents a lot more issues than it causes), but about judging such automated technologies in general.

          For example, many people who argue about completely autonomous cars phrase it as you did in your previous post -- i.e., once the accident stats are as good as the average stats for human drivers, we should view them as a good alternative. But I don't think that'd be a comfort to someone who was killed by an autonomous car acting in a completely stupid manner because the bugs weren't worked out.

          Bottom line is that there will always be side effects to the adoption of new technology, and some of those may be negative. All I'm saying is that it's rational to factor that into judging whether the tech is "better" than humans. Lots of accidents are caused by STUPID human error that is largely preventable (e.g., speeding, following too closely, etc.). I tend to be a much more cautious and conservative driver than average, so quoting average accident rates is not going to convince me to put my safety in the hands of some algorithm.

          But even if the algorithm had the stats of a "good driver," I also want to know not only that it would successfully navigate potential accident scenarios better than I would in some cases, but that it's also not going to randomly kill me by doing something completely weird and unpredictable that I, as a driver, would never do. And if such latter scenarios were more than a freak accident -- that they actually occurred with some regularity -- are you really telling me that you'd want to put your safety in the hands of such an algorithm, just based on the promise that it "performs as good as the average human driver" or even slightly better in terms of overall accident stats?

          Again, I'm not arguing that Tesla's feature isn't helpful. Only that unexpected negative outcomes should also be a serious factor to consider, along with summary stats.

    • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @03:22PM

      by Anonymous Coward on Thursday January 25 2018, @03:22PM (#627704)

      See update from Wired mag, link given above in post
      https://soylentnews.org/comments.pl?noupdate=1&sid=23744&page=1&cid=627660#commentwrap [soylentnews.org]

      Basically Tesla Autopilot doesn't react to stationary objects -- if a lead car moves over a lane to miss a stopped vehicle, the Tesla behind with Autopilot active will stay in lane -- Wired claims this is in the Tesla manual on the Autopilot system. Also claims that the Volvo automatic steering + automatic cruise control works the same way.

  • (Score: 2) by theluggage on Thursday January 25 2018, @11:44AM (1 child)

    by theluggage (1797) on Thursday January 25 2018, @11:44AM (#627643)

    Most importantly here, why is Tesla's autopilot so bad?

    It's not that Autopilot is bad, it's that we apes are actually very good at driving - when we're focussed - and emulating it is a really, really difficult challenge. Currently, they've picked the low-hanging fruit such as parallel parking or staying in lane on a highway designed for safe high-speed driving.

    What we brine sacks are not so good at is understanding that we can't maintain that performance while (e.g.) drunk, asleep, texting or otherwise away with the fairies - something that law enforcement and road safety campaigners have tried and failed to drum into us.

    So, given that it has proven so impossible to persuade some drivers to stay focussed and sober when they are solely responsible for driving, if you show someone a button called "Autopilot" then you might as well also hand them a bottle of vodka and some free credits on Candy Crush.

    A halfway beast is too dangerous.

    This. There's a quantum leap required between what is currently available and a system which is good enough to allow the meatbag to safely sit back and watch a movie, as meatbags will inevitably do. However, in this age of (fr)agile development, when one of the major self-driving players, Google, is also the acclaimed master of the perpetual beta, that's not what the industry wants to hear.

    • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @08:25PM

      by Anonymous Coward on Thursday January 25 2018, @08:25PM (#627836)

      I like that.
      I'll be looking for a place to re-use it. 8-)

      -- OriginalOwner_ [soylentnews.org]

  • (Score: 2) by DeathMonkey on Thursday January 25 2018, @06:38PM

    by DeathMonkey (1380) on Thursday January 25 2018, @06:38PM (#627796) Journal

    Here I presume that the driver told the truth and the autopilot was engaged.

    Clearly you've never supported end-users before!