
posted by Fnord666 on Thursday January 25 2018, @02:16AM
from the stay-alert-stay-alive dept.

El Reg reports

[January 23] a Tesla Model S slammed into a stationary firetruck at around 65mph on Interstate 405 in Culver City, California. The car was driven under the fire engine, although the driver was able to walk away from the crash uninjured and refused an offer of medical treatment.

The motorist claimed the Model S was driving with Autopilot enabled when it crammed itself under the truck. Autopilot is Tesla's super-cruise-control system. It's not a fully autonomous driving system.

[...] The fire truck was parked in the carpool lane of the road with its lights flashing. None of the fire crew were hurt, although Powell noted that if his team had been in their usual position at the back of the truck then there "probably would not have been a very good outcome."

Tesla will no doubt be going over the car's computer logs to determine exactly what happened, something the California Highway Patrol will also be interested in. If this was a case of the driver sticking on Autopilot and forgetting their responsibility to watch the road ahead, it wouldn't be the first time.

In 2016, a driver was killed after both he and the Tesla systems missed a lorry pulling across the highway. A subsequent investigation by the US National Transportation Safety Board found the driver was speeding and had been warned by the car six times to keep his hands on the wheel.

Tesla has since beefed up the alerts the car will give a driver if it feels they aren't paying full attention to the road. The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.

Previous: Tesla's Semiautonomous System Contributed to Fatal Crash


Original Submission

Related Stories

Tesla's Semiautonomous System Contributed to Fatal Crash 40 comments

http://abcnews.go.com/Technology/teslas-semi-autonomous-system-contributed-deadly-crash-feds/story?id=49795839

Federal investigators announced Tuesday that the design of Tesla's semiautonomous driving system allowed the driver of a Tesla Model S in a fatal 2016 crash with a semi-truck to rely too heavily on the car's automation.

"Tesla allowed the driver to use the system outside of the environment for which it was designed," said National Transportation Safety Board Chairman Robert Sumwalt. "The system gave far too much leeway to the driver to divert his attention."

The board's report lists the primary probable cause of the collision as the truck driver's failure to yield, along with the Tesla driver's overreliance on his car's automation — or Autopilot, as Tesla calls the system. Tesla's system design was declared a contributing factor.

[...] A Tesla spokesperson provided a statement to ABC News that read, "We appreciate the NTSB's analysis of last year's tragic accident, and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."

According to The Associated Press, members of the family of Joshua Brown, the driver killed in the crash, said on Monday that they do not blame the car or the Autopilot system for his death.

A National Highway Traffic Safety Administration report on the crash can be found here. The NTSB has not yet published its full report; a synopsis of it can be found here.

Also at The Verge and CNN


Original Submission

  • (Score: 0, Redundant) by Arik on Thursday January 25 2018, @02:36AM (2 children)

    by Arik (4543) on Thursday January 25 2018, @02:36AM (#627514) Journal
    "In 2016, a driver was killed after both he and the Tesla systems missed a lorry pulling across the highway. "

    Ah, so this was in Britain, was it?

    "A subsequent investigation by the US National Transportation Safety Board found the driver was speeding and had been warned by the car six times to keep his hands on the wheel."

    Then why would the US authorities be investigating?

    Hmm, no, that doesn't make sense. I suspect it was a truck that pulled across the highway instead.
    --
    If laughter is the best medicine, who are the best doctors?
    • (Score: 4, Funny) by khallow on Thursday January 25 2018, @03:12AM

      by khallow (3766) Subscriber Badge on Thursday January 25 2018, @03:12AM (#627521) Journal
      The Register is British and fond of using these sorts of terms. For example, there are boffins all over the place. I doubt you will have much luck repelling this dastardly act of British imperialism.
    • (Score: 3, Touché) by hendrikboom on Thursday January 25 2018, @01:40PM

      by hendrikboom (1125) Subscriber Badge on Thursday January 25 2018, @01:40PM (#627677) Homepage Journal

      I suspect that in the Netherlands they would use the Dutch word for 'truck' no matter where in the world the accident occurred.

  • (Score: 0, Informative) by Anonymous Coward on Thursday January 25 2018, @03:13AM (8 children)

    by Anonymous Coward on Thursday January 25 2018, @03:13AM (#627522)

    "... although Powell noted ..."

    Mention who the fuck Powell is before describing what this Powell did. Didn't you learn anything from school?

    • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @03:16AM (5 children)

      by Anonymous Coward on Thursday January 25 2018, @03:16AM (#627525)

      Maybe you'd like to submit a story to SN, Mr/Mrs. peanut gallery.

      • (Score: 2, Insightful) by aristarchus on Thursday January 25 2018, @06:54AM (4 children)

        by aristarchus (2645) on Thursday January 25 2018, @06:54AM (#627577) Journal

You could, but it probably wouldn't do any good. The eds would just reject it, or worse, accept it and then bury it in their own private limbo of hostile user subs. Besides, everyone knows that Mr. Powell is the guy that saved the shark from the Tesla with a drone off of Lummox head, in New Northwest Anglia. Thus the "lorry". But a reminder: the submission guidelines document says that English is the only acceptable language on SoylentNews, so we'll have none of the foreign British language around here. This is a local news aggregator, for Local People! (alt-right)

        • (Score: 2) by janrinok on Thursday January 25 2018, @06:42PM (3 children)

          by janrinok (52) Subscriber Badge on Thursday January 25 2018, @06:42PM (#627798) Journal
We will print your submissions when you can write an unbiased and factual entry. You will get bonus points if you can leave the phrase 'alt-right' out of every sentence. Finally, the topic about which you write must be of interest to the majority, or at least a sizable minority, of our community. This is not your own personal blog where you get to spout out your own political views.--jr
          • (Score: 2) by aristarchus on Thursday January 25 2018, @11:19PM (2 children)

            by aristarchus (2645) on Thursday January 25 2018, @11:19PM (#627940) Journal

Thanks for the response, janrinok! But it is off-topic. I was just responding to the reply to criticism of the submission with the formerly valid "if you don't like it, submit something yourself" retort. Even if you do submit fair and unbiased submissions like I do, you can be rejected for no reason other than the political bias of a minority of the editorial team. (I don't include you in that group, janrinok.) I only submit things which I think will be of general interest to my fellow Soylentils. But it seems the obvious "submit something yourself" rebuttal is no longer appropriate.

            Well, at least now we know who "Powell" is.

            • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @11:36PM (1 child)

              by Anonymous Coward on Thursday January 25 2018, @11:36PM (#627963)

You could, y'know, try not spewing flamebait crap in your summaries. Please write a journal with links to the various stories you've had rejected; it would be interesting to see where the truth lies between janrinok's advice and your own idea of what is happening. I've only seen a few of your summaries that got rejected, and at least one was pretty filled with flamebait crap. I've seen similar from other users, but as you point out there is a definite political bias by some of those who run this site.

              So, compile that list of stories, I'll read the summaries and give you a trolling score. 0 trolling = good submission, 10 trolling you are the wooorst. I dunno, something like that.

              • (Score: 2) by aristarchus on Thursday January 25 2018, @11:44PM

                by aristarchus (2645) on Thursday January 25 2018, @11:44PM (#627969) Journal

Thank you for the offer, but you are an AC? What would be the point of submitting things for your judgment? Perhaps if we had a mod system for submissions? Then a lowly AC's opinion might count, and we could all #freearistarchus!!!

    • (Score: 1) by tftp on Thursday January 25 2018, @03:18AM (1 child)

      by tftp (806) on Thursday January 25 2018, @03:18AM (#627528) Homepage
      It's pretty obvious that Powell is the fire crew chief. Sapienti sat.
      • (Score: 5, Funny) by Anonymous Coward on Thursday January 25 2018, @04:33AM

        by Anonymous Coward on Thursday January 25 2018, @04:33AM (#627548)

        "Sapienti sat."

        Mention who the fuck Sapienti is before stating whether this Sapienti is sitting or not. Didn't you learn anything from the "but who was Powell" situation?

  • (Score: 3, Insightful) by tftp on Thursday January 25 2018, @03:16AM (24 children)

    by tftp (806) on Thursday January 25 2018, @03:16AM (#627523) Homepage

Here I presume that the driver told the truth and the autopilot was engaged. But how could it be that neither the bag of salty water nor the box of silicon shards could see a fire truck with all lights on? Most importantly here, why is Tesla's autopilot so bad? Perhaps we (humanity) want to ban autopilots that work in 99.9% of cases and kill the driver in 0.01%? In other words, we want either no autopilot, or an autopilot that is real (Waymo etc.) - with cameras, lidars, etc. A halfway beast is too dangerous.

    • (Score: -1, Troll) by Anonymous Coward on Thursday January 25 2018, @03:23AM (3 children)

      by Anonymous Coward on Thursday January 25 2018, @03:23AM (#627530)

      Seems pretty obvious to me, no one trained the "AI" to deal with an emergency vehicle stopped in the HOV (carpool/hybrid/EV) lane. Just another example of Tesla using its customers for beta testing, nothing to see here, move along please...

      • (Score: 5, Informative) by Anonymous Coward on Thursday January 25 2018, @01:00PM (2 children)

        by Anonymous Coward on Thursday January 25 2018, @01:00PM (#627660)

        Update from Wired -- https://www.wired.com/story/tesla-autopilot-why-crash-radar/ [wired.com]

        Why Tesla's Autopilot Can't See a Stopped Firetruck
        ...
        This surprisingly non-deadly debacle also raises a technical question: How is it possible that one of the most advanced driving systems on the planet doesn't see a freaking fire truck, dead ahead?

        Tesla didn't confirm the car was running Autopilot at the time of the crash, but its manual does warn that the system is ill-equipped to handle this exact sort of situation: “Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead.”

        Volvo's semi-autonomous system, Pilot Assist, has the same shortcoming. Say the car in front of the Volvo changes lanes or turns off the road, leaving nothing between the Volvo and a stopped car. "Pilot Assist will ignore the stationary vehicle and instead accelerate to the stored speed," Volvo's manual reads, meaning the cruise speed the driver punched in. "The driver must then intervene and apply the brakes.” In other words, your Volvo won't brake to avoid hitting a stopped car that suddenly appears up ahead. It might even accelerate towards it.

        The same is true for any car currently equipped with adaptive cruise control, or automated emergency braking. It sounds like a glaring flaw, the kind of horrible mistake engineers race to eliminate. Nope. These systems are designed to ignore static obstacles because otherwise, they couldn't work at all.

        Looks like parent had a point, even if made in a crude way.

The Wired story has more explanation of the design tradeoffs involved, and why lidar is needed (because, unlike radar, it can distinguish between road furniture (signs, etc.) and an actual obstacle).
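
        To make the failure mode concrete, here is a minimal Python sketch of that kind of stationary-target filter (purely illustrative; the field names, threshold, and numbers are assumptions, not Tesla's or Volvo's actual logic):

            # Radar reports range (m) and relative speed (m/s, negative = closing).
            # A return whose ground-frame speed is ~0 looks just like an overhead
            # sign or a guardrail, so the follow logic drops it -- the behavior
            # both manuals warn about.
            OWN_SPEED = 29.0  # m/s, about 65 mph

            def ground_speed(own_speed, rel_speed):
                return own_speed + rel_speed

            def pick_follow_target(returns, own_speed, min_moving=2.0):
                # Follow the nearest *moving* return; stationary ones are ignored.
                moving = [r for r in returns
                          if abs(ground_speed(own_speed, r["rel_speed"])) > min_moving]
                return min(moving, key=lambda r: r["range"], default=None)

            # Lead car at 40 m matching our speed; stopped fire truck at 80 m:
            returns = [{"range": 40.0, "rel_speed": 0.0},
                       {"range": 80.0, "rel_speed": -29.0}]
            print(pick_follow_target(returns, OWN_SPEED))  # follows the lead car

            # Lead car changes lanes; only the stationary truck remains. It is
            # filtered out, so cruise resumes the set speed: no braking.
            print(pick_follow_target([{"range": 60.0, "rel_speed": -29.0}], OWN_SPEED))  # None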

        • (Score: 2, Interesting) by tftp on Thursday January 25 2018, @07:19PM

          by tftp (806) on Thursday January 25 2018, @07:19PM (#627818) Homepage
This answer is exactly what I was asking for! But with this knowledge it is beyond scary to use this killer feature. If the leading car leaves the lane, there must be a reason! Usually it's not a problem, but sometimes there is something ahead in the lane. In my experience it has been rocks, wood, a large chair, a broken-down car, a police car, a car stopped at a red light (very common)... every driver watches for these things, as he may need to brake right away. However, this "autopilot" does the opposite of what a driver would do.
        • (Score: 3, Informative) by gawdonblue on Thursday January 25 2018, @11:25PM

          by gawdonblue (412) on Thursday January 25 2018, @11:25PM (#627948)

I was a passenger in one of these adaptive cruise control cars yesterday when the car in front and the car I was in both made a turn, and they very nearly collided. Basically the car ahead slowed to take the turn, so the car I was in automatically slowed to keep the distance; but when that car disappeared around the corner, the car I was in accelerated into the "clear" space, and our driver had to brake very heavily to make the turn and then steer to avoid the slower car in front. It was a little bit scary.

    • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @03:58AM (1 child)

      by Anonymous Coward on Thursday January 25 2018, @03:58AM (#627536)

      Doesn't necessarily require LIDAR. If safety stats are better than human, the technology should be allowed. If it's worse than humans, it needs more development.

      • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @09:48PM

        by Anonymous Coward on Thursday January 25 2018, @09:48PM (#627883)

        > If safety stats are better than human, the technology should be allowed.

        Which humans are you doing stats on?

        I claim that I'm about 10x less likely to get into a serious accident than the lumped total of the USA driving population. I don't drink or otherwise drive impaired or sleepy. My car is well maintained, I keep the windows clear (of snow, etc), use seat belts all the time, and the lights are all working. I'm well past the hormone-infused youth stage, but not yet old enough that my senses are going bad. And I've attended several advanced driver training classes, starting when I was a teen and including some race track training (go fast on the track, not on the road).

        That's not to tempt fate and say that I'm not going to have a bad accident, but I believe my chances of a safe trip are much higher than the average.

    • (Score: 5, Insightful) by MostCynical on Thursday January 25 2018, @04:04AM (1 child)

      by MostCynical (2589) on Thursday January 25 2018, @04:04AM (#627538) Journal

      Autopilot engaged.

      Brain disengaged.

      Car totalled.

      --
      "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
      • (Score: 2) by bob_super on Thursday January 25 2018, @06:36PM

        by bob_super (1357) on Thursday January 25 2018, @06:36PM (#627794)

        Driver unterminated, Darwin Saddened.
        Fire pants browned.

    • (Score: 3, Interesting) by coolgopher on Thursday January 25 2018, @04:05AM

      by coolgopher (1157) on Thursday January 25 2018, @04:05AM (#627539)

      It does seem to me that the most likely explanation is that the autopilot was not, in fact, engaged at the time.

Not having a Tesla, I don't know under what conditions it might disengage the autopilot, but I have a vague recollection of it giving off a bunch of beeps before doing so. Perhaps those beeps were ignored?

      The investigation should tell us more once it's done.

    • (Score: 1, Insightful) by Anonymous Coward on Thursday January 25 2018, @04:09AM (4 children)

      by Anonymous Coward on Thursday January 25 2018, @04:09AM (#627541)

      In my younger years, there was an urban legend about the operator of a motor home who set the "cruise control" and walked back into the kitchenette to make a sandwich.
      The vehicle crashed, of course.
(Labeling it "speed control" would make it less prone to an erroneous judgment of the device's capabilities.)

      In the prior incident (the "lorry" thing mentioned in a comment above), the driver was speeding and watching a Harry Potter movie.
      It wouldn't surprise me a bit if there was a similar distraction involved in this latest case.

      ...and, as TFS mentions, the vehicle will bitch at you if you take your hands off the steering wheel.
      It should be clear that it is just an aid.

      -- OriginalOwner_ [soylentnews.org]

      • (Score: 1) by anubi on Thursday January 25 2018, @10:03AM (2 children)

        by anubi (2828) on Thursday January 25 2018, @10:03AM (#627615) Journal

        I have a "speed control" in my van. I will not use it for this very reason.

        When I re-do its wiring, that is going to be one of the first things to be permanently removed from service.

        I take driving a vehicle extremely seriously. Not only for me, and my property, but everyone else on the road as well.

        Nobody's safe if people aren't paying attention when driving.

Having a vehicle under control of an inattentive driver is worse than putting a live gun in a child's playpen. While a child may take out a person, an inattentive driver can easily wipe out a whole family, possibly two or more families, in one big bang.

        --
        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
        • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @02:50PM (1 child)

          by Anonymous Coward on Thursday January 25 2018, @02:50PM (#627696)

That is an overreaction.

Cruise control removes the need to continually monitor speed. I tend to go too fast if left to my own devices; cruise control keeps me from doing that. On top of that, I have found myself following too closely in the past; adaptive cruise control keeps that from happening.

So kudos, if you are the perfectly attentive driver who can constantly monitor speed, lane position, distance from the car in front of you, and traffic to the side and behind you for many hours on a road trip. I doubt you're nearly as good as you think you are.

You aren't the one in the left-hand lane going five under, are you? It's my experience that those drivers also have an inflated sense of their own driving ability.

          • (Score: 2) by bob_super on Thursday January 25 2018, @06:43PM

            by bob_super (1357) on Thursday January 25 2018, @06:43PM (#627800)

> You aren't the one in the left-hand lane going five under, are you?
> It's my experience that those drivers also have an inflated sense of their own driving ability.

            Not surprising, since they are always told by the insurance that they are at "no fault" for getting rear-ended.
            If I was a cop, left-lane cruisers would be in pain (their wallets, at least). What's so evil about the right lane? I love it because it's always the empty one.

      • (Score: 1) by khallow on Thursday January 25 2018, @03:24PM

        by khallow (3766) Subscriber Badge on Thursday January 25 2018, @03:24PM (#627706) Journal

(Labeling it "speed control" would make it less prone to an erroneous judgment of the device's capabilities.)

        Guess I don't see the difference between "speed" and "cruise" here. When it comes to idiots, I doubt such labels matter much.

    • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @07:08AM

      by Anonymous Coward on Thursday January 25 2018, @07:08AM (#627579)

Well, since the car crammed itself under the firetruck, it's possible that the box of silicon shards missed it simply because it was too high and therefore out of whatever height range the cameras/lidar/etc. are set for. As for the bag of salty water, there are oodles of possibilities: chemical impairment, cell/smart phone distraction, sleep deprivation, being an idiot, eating or drinking, and that's without looking up any statistics on car crash causes. I'm sure I missed a few.

    • (Score: 4, Informative) by GreatAuntAnesthesia on Thursday January 25 2018, @11:00AM (5 children)

      by GreatAuntAnesthesia (3275) on Thursday January 25 2018, @11:00AM (#627628) Journal

why is Tesla's autopilot so bad?

      The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.

      Sounds to me like Tesla's autopilot is actually pretty good. Good doesn't have to mean "no accidents at all", it just has to mean "as few or fewer accidents than a human driver".
      Note also that the human driver should have been watching the road in this case. That's how Tesla's autopilot is supposed to be used.

Perhaps we (humanity) want to ban autopilots that work in 99.9% of cases and kill the driver in 0.01%

      Well, rather than working with numbers pulled out of your exhaust pipe, why don't we look at some actual, real statistics? [electrek.co]
Turns out the Tesla autosteer crash rate is about 0.08 crashes per million miles. Unfortunately I can't find any relevant statistics for human drivers for comparison (all my google searches come up with fatalities per million miles, not crashes per million miles.) However, if you consider that a million miles represents a lifetime of driving for your average driver (10-20k miles per year), that works out to roughly a one-in-twelve chance of a reportable crash (which won't necessarily be fatal or even life-changing) in fifty to a hundred years of driving, which I would call pretty good, and probably at least as safe as the average human driver.
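
      A quick back-of-envelope check of that arithmetic in Python (illustrative only, using the figures above):

          # Expected reportable crashes over a driving lifetime, assuming the
          # ~0.08 crashes per million miles figure from the electrek link.
          rate_per_mile = 0.08 / 1_000_000
          lifetime_miles = 1_000_000       # 50-100 years at 10-20k miles/year
          print(rate_per_mile * lifetime_miles)  # 0.08 -> roughly a 1-in-12 chance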

      • (Score: 3, Interesting) by nobu_the_bard on Thursday January 25 2018, @02:50PM

        by nobu_the_bard (6373) on Thursday January 25 2018, @02:50PM (#627695)

That seems probable. The fact that half of all Tesla crash reports make the news probably causes the same problem airplane crashes have: it distorts the perception of how common they are.

      • (Score: 2) by AthanasiusKircher on Thursday January 25 2018, @03:12PM (2 children)

        by AthanasiusKircher (5291) on Thursday January 25 2018, @03:12PM (#627701) Journal

        Before I reply, note first that I mostly agree with your general point -- the Tesla "autopilot" stuff seems like it probably does a lot more good than bad. However, I'd argue about the standards you're using a bit...

        The safety board did note in its report that the introduction of Tesla's Autosteer software had cut collisions by 40 per cent.
        Sounds to me like Tesla's autopilot is actually pretty good. Good doesn't have to mean "no accidents at all", it just has to mean "as few or fewer accidents than a human driver".

That shouldn't be the relevant standard for "good." One should also consider whether or not the system causes accidents that would not have occurred in the first place. A "good" system might cause a few new incidents in unexpected scenarios, but usually our standards shouldn't just be about the overall accident rate.

        To put this in a different context, say you had a daily supplement that "cuts fatal heart attacks by 40%." Sounds great, right? As a pure stat, it certainly sounds promising. But say I told you that in a given population, there were generally 1000 fatal heart attacks. And this supplement seemed to prevent all 1000 of those high-risk folks from having a heart attack. But it also CAUSED 600 fatal heart attacks to happen in otherwise relatively low-risk folks. Overall, it cut heart attack incidence in the population by 40%, but I don't think we'd call this a "good" drug... it's killing large numbers of people, even while saving others. The side effects may not be worth the benefit.
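
        The analogy's arithmetic, spelled out (hypothetical numbers from the paragraph above, not real medical data):

            baseline = 1000     # fatal heart attacks without the supplement
            prevented = 1000    # high-risk cases averted
            caused = 600        # new deaths among otherwise low-risk takers
            total = baseline - prevented + caused   # 600
            print(1 - total / baseline)             # 0.4 -> a "40% cut"
            # A 40% overall reduction, yet 600 people died who otherwise would not.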

        To be clear, I'm NOT saying that's true here with Tesla. But it's an important reason why we should pay attention to cases that appear like they might be a failure of the Autopilot system to have reasonable behavior.

I'll also agree with you if you argue that Tesla gets a lot of negative press for such incidents. But they also invite it. Musk tries to get all the media attention he can, and that means he's also going to get negative stuff when something bad happens. He also tends to get really defensive at any criticism. AND (perhaps most importantly), Tesla steadfastly refuses to alter the name of their "Autopilot" feature, despite the fact that it's clear huge numbers of people who hear that name misunderstand that it's mostly just enhanced cruise control. So, you can argue about the idiots who abuse it, but I think fewer idiots would abuse it if it had a different name. But that's Tesla's marketing decision -- they obviously think they'll get more attention and sell more cars with "Autopilot," so they have to suck it up when negative press comes along because idiots misunderstand that name.

        Well, rather than working with numbers pulled out of your exhaust pipe, why don't we look at some actual, real statistics? [electrek.co]
        Turns out the Tesla autosteer crash rate is about 0.08 crashes per million miles. Unfortunately I can't find any relevant statistics for human drivers for comparison (all my google searches come up with fatalities per million miles, not crashes per million miles.)

        Again, statistical comparisons should be done with care. It doesn't make much sense to compare an enhanced cruise control feature that's likely most used in open-highway situations to a general driving stat for humans (which includes high-density traffic situations where most crashes occur).

        Perhaps a more apt comparison would be to look at the number of crashes with "Autopilot" vs. the number of crashes in cars with humans using standard cruise control. That would probably be a more like-to-like comparison. I suspect "Autopilot" would do significantly better there too, because normal cruise control (like "Autopilot") tends to lead people to be more distracted while driving... but normal cruise control has no ability to respond, whereas "Autopilot" has more enhanced safety features.

        • (Score: 2) by GreatAuntAnesthesia on Thursday January 25 2018, @04:29PM (1 child)

          by GreatAuntAnesthesia (3275) on Thursday January 25 2018, @04:29PM (#627736) Journal

          Interesting points, but I don't think your heart attack analogy stands up.

In your scenario, you have a clearly defined default position, a "natural" state: everybody is at risk of a heart attack, some high risk and some low.
I'd argue that in the world of cars, there is no natural state. Human drivers are the default, but only because that technology was invented first. Everybody has a heart[1], but not everybody has to have a car. Indeed, one could imagine a society with no cars at all. We choose to have cars in our society, to pay their cost in lives for all the benefits and conveniences they bring. The autopilot isn't disrupting the natural order of things in the way your heart attack drug is. Or if it is, then the human drivers are too, and the only difference between the two options is the number of deaths.

          Put it this way: If we existed in some improbable alternate universe where Tesla Autopilot had been invented before manual controls, would we be sat here arguing whether putting humans behind the wheel would rightfully save some lives at the expense of many more?

[1] Insert obligatory snark here about Rupert Murdoch / Dick Cheney / Donald Trump.

          • (Score: 2) by AthanasiusKircher on Thursday January 25 2018, @07:26PM

            by AthanasiusKircher (5291) on Thursday January 25 2018, @07:26PM (#627821) Journal

            Put it this way: If we existed in some improbable alternate universe where Tesla Autopilot had been invented before manual controls, would we be sat here arguing whether putting humans behind the wheel would rightfully save some lives at the expense of many more?

            I take your point. But in the very way you just framed that, you automatically are presuming a beneficial outcome. My point wasn't just about Tesla Autopilot (which I explicitly admitted likely prevents a lot more issues than it causes), but about judging such automated technologies in general.

For example, many people who argue about completely autonomous cars phrase it as you did in your previous post -- i.e., once the accident stats are as good as the average stats for human drivers, we should view them as a good alternative. But I don't think that'd be a comfort to someone who was killed by an autonomous car acting in a completely stupid manner because the bugs weren't worked out.

            Bottom line is that there will always be side effects to the adoption of new technology, and some of those may be negative. All I'm saying is that it's rational to factor that into judging whether the tech is "better" than humans. Lots of accidents are caused by STUPID human error that is largely preventable (e.g., speeding, following too closely, etc.). I tend to be a much more cautious and conservative driver than average, so quoting average accident rates is not going to convince me to put my safety in the hands of some algorithm.

            But even if the algorithm had the stats of a "good driver," I also want to know not only that it would successfully navigate potential accident scenarios better than I would in some cases, but that it's also not going to randomly kill me by doing something completely weird and unpredictable that I, as a driver, would never do. And if such latter scenarios were more than a freak accident -- that they actually occurred with some regularity -- are you really telling me that you'd want to put your safety in the hands of such an algorithm, just based on the promise that it "performs as good as the average human driver" or even slightly better in terms of overall accident stats?

Again, I'm not arguing that Tesla's feature isn't helpful. Only that unexpected negative outcomes should also be a serious factor to consider, along with summary stats.

      • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @03:22PM

        by Anonymous Coward on Thursday January 25 2018, @03:22PM (#627704)

        See update from Wired mag, link given above in post
        https://soylentnews.org/comments.pl?noupdate=1&sid=23744&page=1&cid=627660#commentwrap [soylentnews.org]

Basically Tesla Autopilot doesn't react to stationary objects -- if a lead car moves over a lane to miss a stopped vehicle, the Tesla behind with Autopilot active will stay in its lane. Wired claims this is in the Tesla manual's coverage of the Autopilot system. It also claims that Volvo's automatic steering + adaptive cruise control works the same way.

    • (Score: 2) by theluggage on Thursday January 25 2018, @11:44AM (1 child)

      by theluggage (1797) on Thursday January 25 2018, @11:44AM (#627643)

Most importantly here, why is Tesla's autopilot so bad?

It's not that Autopilot is bad, it's that we apes are actually very good at driving - when we're focussed - and emulating that is a really, really difficult challenge. Currently, they've picked the low-hanging fruit such as parallel parking or staying in lane on a highway designed for safe high-speed driving.

      What we brine sacks are not so good at is understanding that we can't maintain that performance while (e.g.) drunk, asleep, texting or otherwise away with the fairies - something that law enforcement and road safety campaigners have tried and failed to drum into us.

      So, given that it has proven so impossible to persuade some drivers to stay focussed and sober when they are solely responsible for driving, if you show someone a button called "Autopilot" then you might as well also hand them a bottle of vodka and some free credits on Candy Crush.

      A halfway beast is too dangerous.

      This. There's a quantum leap required between what is currently available and a system which is good enough to allow the meatbag to safely sit back and watch a movie, as meatbags will inevitably do. However, in this age of (fr)agile development, when one of the major self-driving players, Google, is also the acclaimed master of the perpetual beta, that's not what the industry wants to hear.

      • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @08:25PM

        by Anonymous Coward on Thursday January 25 2018, @08:25PM (#627836)

        I like that.
        I'll be looking for a place to re-use it. 8-)

        -- OriginalOwner_ [soylentnews.org]

    • (Score: 2) by DeathMonkey on Thursday January 25 2018, @06:38PM

      by DeathMonkey (1380) on Thursday January 25 2018, @06:38PM (#627796) Journal

Here I presume that the driver told the truth and the autopilot was engaged.

      Clearly you've never supported end-users before!

  • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @04:28AM (6 children)

    by Anonymous Coward on Thursday January 25 2018, @04:28AM (#627546)

    The moon matrix tells us that Elon Musk is a murderer, and it's sheer luck one of his gadgets didn't kill multiple people here. Better be careful sitting on the same couch as the Musky One. He'll grab you by your whatever! His launch vehicles will cause manufacturing defects in their payloads!

    (Oh, and whatever you do, don't give him any of that military-industrial complex pork!)

    I mean, it's totally not like "the press" could instead use their megaphone to spread awareness to Tesla owners that auto-pilot doesn't mean self-driving and put the accountability for the collision squarely on the operator of the machine, who was clearly being criminally negligent, but instead we get FUD.

    Imagine how strange it would be if a drunk driver killed someone and "the press" ran a smear campaign against Budweiser! It was all the alcohol's fault, your honor! I'm innocent! Somebody should make a law!

    • (Score: 2) by realDonaldTrump on Thursday January 25 2018, @06:22AM (5 children)

      by realDonaldTrump (6614) on Thursday January 25 2018, @06:22AM (#627574) Homepage Journal

      I’ve watched people and I study people and I had, in particular, a great tutor on this, but I look and I see what it does to people when they lose control, and a lot of times they lose control. With alcohol or with drugs. A guy drinks too much and you have something HORRIBLE. Like Ted Kennedy. Like what happened to Dodi & Diana (God I wanted to bang her, but she crashed). Or like this guy in California. They made pot legal and this guy had a BAD car accident. He wasn't hurt, he could have been hurt. Nobody was hurt, somebody could have been hurt. The guy's car, the fire engine, maybe some damage to those. Probably some damage. Somebody's insurance is going up. Because somebody got high! They put a tax on pot, they think it'll help their economy. It won't help, folks. Because there's going to be another CRASH like this one. And the insurance folks -- the driver's insurance -- will go, "oh no, too many people are getting high, we need to raise EVERYONE'S rates."

      Let me tell you, when we hire a driver, we check his breath. And we do the drug test. Always, always.

      • (Score: 2, Insightful) by anubi on Thursday January 25 2018, @10:24AM (3 children)

        by anubi (2828) on Thursday January 25 2018, @10:24AM (#627618) Journal

We are doing a pretty good job, I think, with nailing people who DUI, and I 100% agree with the penalties imposed. But I do believe something has to be done about all these "entertainment centers" in cars. I am not in the car to be "entertained"! Driving is damned serious business.

        I'll offer a short rant on something on modern cars that really bugs me... the radio.

Some of this technology is great... GPS, navigation aids, things to warn you if someone is in your blind spot. But some things, like these computer-controlled radios that can no longer be adjusted by feel, where I have to go through menus and lots of tiny little pushbutton crap and read fine print on a display - this kind of stuff does NOT belong in a car! If I have a radio in the car, it HAS to be one I can adjust by feel. I have a new "modern" radio in my van, and I hate it for that reason. I liked the one in my 40-year-old car: two knobs and five large preset buttons, because DJs often get carried away once they have a microphone in front of them and forget we tuned in for some music.

        I will replace the bojangled thing with an old-school AM-FM thingie if I can find one that even an old man can use.

        --
        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
        • (Score: 1, Informative) by Anonymous Coward on Thursday January 25 2018, @11:13AM

          by Anonymous Coward on Thursday January 25 2018, @11:13AM (#627639)

          if I can find one

          Go to the boneyard. (Pick-a-Part)
          On a clapped out old heap, I would think that the radio would still work.

          DJ's often get carried away once they have a microphone

          I hate radio "personalities" too.
          Aside from the classical station, Bob Parlocha on the jazz station was the last good one and he's been dead almost 3 years.

          -- OriginalOwner_ [soylentnews.org]

        • (Score: 2) by takyon on Thursday January 25 2018, @04:34PM

          by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Thursday January 25 2018, @04:34PM (#627738) Journal

          What you need is Amazon Alexa: Car Edition.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
        • (Score: 2) by Freeman on Thursday January 25 2018, @05:46PM

          by Freeman (732) on Thursday January 25 2018, @05:46PM (#627770) Journal

          I would say we're pretty good with nailing the average citizen who DUIs. Money and Power still go a long way towards making some things disappear.

          As far as radio, etc. goes in modern vehicles. A good one has channel up/down and volume up/down on your steering wheel. Perhaps both of my vehicles are a decade out-of-date? Tiny push buttons, fine print displays, and the like should be operated when the vehicle is parked. You shouldn't be trying to reach over and access the glove compartment and fiddling with that unless you're parked, either.

          --
          Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
      • (Score: 0) by Anonymous Coward on Thursday January 25 2018, @11:04AM

        by Anonymous Coward on Thursday January 25 2018, @11:04AM (#627633)

        God I wanted to bang her, but she crashed

        It's not too late, Don. I mean you know where she's buried, and it's not like she's going to fight back. Or can you only get it up when they are struggling and shouting "no!"?

        Oh, and be careful who you hire to transport you around, even if their drug tests do check out: https://www.youtube.com/watch?v=CVZvUVTphW0 [youtube.com]
