
posted by martyb on Friday May 17 2019, @09:48AM   Printer-friendly
from the keep-your-eyes-on-the-road-and-your-hands-on-the-wheel dept.

Tesla's advanced driver assist system, Autopilot, was active when a Model 3 driven by a 50-year-old Florida man crashed into the side of a tractor-trailer truck on March 1st, the National Transportation Safety Board (NTSB) states in a report released on Thursday. Investigators reviewed video and preliminary data from the vehicle and found that neither the driver nor Autopilot "executed evasive maneuvers" before striking the truck.

[...] The driver, Jeremy Beren Banner, was killed in the crash. It is at least the fourth fatal crash of a Tesla vehicle involving Autopilot.

This crash is eerily similar to another one involving a Tesla near Gainesville, Florida. In that incident, Joshua Brown was killed in May 2016 when his Model S sedan collided with a semitrailer truck on a Florida highway, making him the first known fatality in a semi-autonomous car.

The National Highway Traffic Safety Administration (NHTSA) determined that a "lack of safeguards" contributed to Brown's death. Meanwhile, today's report is just preliminary, and the NTSB declined to place blame on anyone.

Source: The Verge

Also at Ars Technica.


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Friday May 17 2019, @11:32AM (10 children)

    by Anonymous Coward on Friday May 17 2019, @11:32AM (#844663)

    By now, students of self-driving car tech know that Tesla is using its customers for beta testing. There are bound to be a few casualties along the golden road to the autonomous future.

    Of course, not all the *customers* may realize this.

    • (Score: 1, Disagree) by Anonymous Coward on Friday May 17 2019, @12:27PM (7 children)

      by Anonymous Coward on Friday May 17 2019, @12:27PM (#844676)

      BUT can you really blame Tesla when the NHTSA, NTSB, etc. did nothing to stop them from placing an untested piece of technology on the roads like this?

      I mean, assisted cruise control was one thing, but anything more advanced should have been required to advertise itself as nothing more than the most capable documented form of assisted cruise control, and any 'autopilot'-esque feature should have required the drafting of safety and design documents before it was ever advertised or allowed to operate as a full lane-assist, acceleration, and deceleration system on any vehicle using public roads.

      Personally I think this is the turning point for Tesla (not the beginning of the end, which has already been under way for some time). After this failure, lawsuits are going to come out of the woodwork and federal scrutiny is going to rain down on them, if not under this administration, then under the next. And when that time comes, it will be just like the 1960s all over again, and the time of the incumbent automotive manufacturers will return.

      • (Score: 4, Insightful) by All Your Lawn Are Belong To Us on Friday May 17 2019, @02:40PM (6 children)

        by All Your Lawn Are Belong To Us (6553) on Friday May 17 2019, @02:40PM (#844713) Journal

        Yes, you really can blame Tesla.
        And the NHTSA and NTSB, and Congress for all closing their eyes to it.
        There's more than enough blame to go around for a system that, by design, is unnecessary. As in, one is always supposed to remain attentive to the road at all times when on autopilot.... so why have it?
        How odd that two commercial airline crashes ground an entire fleet of aircraft. Two crashes of cars with autopilots (plus killed pedestrians), and the regulatory response is, "meh".

        --
        This sig for rent.
        • (Score: 4, Interesting) by Knowledge Troll on Friday May 17 2019, @02:58PM (4 children)

          by Knowledge Troll (5948) on Friday May 17 2019, @02:58PM (#844723) Homepage Journal

          one is always supposed to remain attentive to the road at all times when on autopilot.... so why have it?

          Only once have I seen a Tesla fan get this one right. He knew exactly what he was doing: supervising and training a robot. He was very clear about what the task was. He wasn't using the car to automatically get someplace; he was truly supervising the car and providing feedback, hoping to improve the Tesla fleet as a whole.

          That's one out of dozens of fans I've seen.

          • (Score: 0) by Anonymous Coward on Friday May 17 2019, @05:13PM (3 children)

            by Anonymous Coward on Friday May 17 2019, @05:13PM (#844778)

            Yes, I know one couple that also uses their Tesla Model S Autopilot this way. Both are "software royalty" and have top-level research jobs at top university artificial intelligence labs.

            All the other Tesla owners I know...well, let's just say that they won't become my close friends--I'd hate to lose them but don't want the heartache of losing a really close friend this way.

            • (Score: 5, Insightful) by edIII on Friday May 17 2019, @09:04PM (2 children)

              by edIII (791) on Friday May 17 2019, @09:04PM (#844836)

              This I understand. I don't have a choice though. I've already become close to people that have lane change assist technology as well as this auto-pilot crap, and not in Teslas either.

              What makes it concerning is that they have an attitude that they can stop paying attention. One friend bragged that he didn't lift his eyes from a movie playing on a tablet from Los Angeles to Las Vegas. They think these things are fucking Johnny Cabs from Total Recall, and they are not anywhere close to that level of tech yet. Musk can burn in hell for taking advantage of these people as his personal guinea pigs for testing.

              It was insanely premature to allow this tech on the road. We need 5 years of extensive testing with hundreds of vehicles on a closed testing track before we can confidently say anything about the performance of the tech. They've done, IMO, less than 1% of the regression testing required. What's worse is that failures should mandate legal disabling of these features until they go through several more cycles of development and testing.

              ALL OF THAT is an ideal world. We've not even got to the security concerns yet, and there was an article here recently about placing special stickers on the road to deliberately confuse these systems. So they not only need to perform extensive regression testing (all kinds of vehicles and situations tested), but also need to account for malicious activity. Unless we want a cyber-terrorist throwing AI-bombs onto the freeway.

              Reminds me of that scientist in Spider-Man: "We need to go back to formula".

              --
              Technically, lunchtime is at any moment. It's just a wave function.
              • (Score: 4, Insightful) by coolgopher on Saturday May 18 2019, @12:53AM (1 child)

                by coolgopher (1157) on Saturday May 18 2019, @12:53AM (#844894)

                It was insanely premature to allow this tech on the road.

                I'd also say that at least a quarter of the human drivers on our roads were allowed on insanely prematurely.

                In the end, I figure it kinda squares out. Humans do really dumb and dangerous things, robots do really dumb and dangerous things. *shrug* As a driver it's my job to be prepared for others doing dumb and dangerous things, and ideally not inflict such dumb and dangerous stuff on others.

                • (Score: 2) by Bot on Saturday May 18 2019, @09:45PM

                  by Bot (3902) on Saturday May 18 2019, @09:45PM (#845138) Journal

                  >robots do really dumb and dangerous things

                  systemd made me do it

                  --
                  Account abandoned.
        • (Score: 2) by etherscythe on Sunday May 19 2019, @05:05PM

          by etherscythe (937) on Sunday May 19 2019, @05:05PM (#845268) Journal

          <sarcasm>Horses work great getting you from one place to another - why have cars at all?</sarcasm>

          I think the purpose of the driver assist is pretty great - reduce deaths in traffic due to human error, which is one of the biggest killers in first world countries today.

          That said, I bet we have a lot to agree on: I also think using the name "autopilot" for a system which is essentially only a safety watchdog with secondary system control was a huge mistake on Tesla's part, especially in beta status. Don't call it autopilot until the car is literally driving completely by itself, and we've tested it well enough to feel at least somewhat safe with it legally. The current name gives people completely the wrong impression of how the feature is meant to be used. The documentation says one thing, but naming it Autopilot essentially says, "yeah, we only included that language because we were required to by law. We here in the cool kids club know that what it's really for is showing off to your girlfriend/drinking buddies how awesome you are by letting the car drive by itself while you don't pay attention, because you're so rad you can afford that car of the future."

          One of my coworkers recently bought an older used Model S, and I'm a little disturbed by how casual he is in letting the car do what it wants to do on Autopilot v1 hardware. On the other hand, I'm also impressed by the collision it has already avoided during slightly distracted driving (not watching several cars ahead): he still had his hands on the wheel and eyes on the road when the car in front of him suddenly slammed on its brakes in response to slowing traffic ahead. This speaks to what the vision really was - help people look out for themselves, because digital billboards shift, solicitors/homeless stand with signs dangerously close to the road, the rising/setting sun glares right in your eyeballs, or you just didn't sleep well last night, and any number of other temporary debilitations or distractions really increase the risk of injury on the road.

          Tesla was bound to make some mistakes, but I think some of these collisions were easily preventable had this branding decision not been made recklessly in the name of marketing, overhyping a feature just to stick another thumb in the eye of the established automakers. Musk is a central figure in that aspect of the business, and I will absolutely call him out on it.

          --
          "Fake News: anything reported outside of my own personally chosen echo chamber"
    • (Score: 0) by Anonymous Coward on Friday May 17 2019, @07:42PM (1 child)

      by Anonymous Coward on Friday May 17 2019, @07:42PM (#844821)

      How about the issue of false advertising? A "driver assist" is very different from an "auto-pilot".

      • (Score: 2) by Bot on Saturday May 18 2019, @09:48PM

        by Bot (3902) on Saturday May 18 2019, @09:48PM (#845140) Journal

        False advertising, all right, but let us not assume Musk's real motives. For example, a factually accurate slogan, Burma-Shave style:

        Tesla

        increasing the average IQ of the nation

        crash by crash

        --
        Account abandoned.
  • (Score: 3, Interesting) by Arik on Friday May 17 2019, @01:16PM (9 children)

    by Arik (4543) on Friday May 17 2019, @01:16PM (#844684) Journal
    Indeed. I'm thinking specifically of safeguards that should prohibit the use of the public roads for such dangerous experiments in the first place.

    --
    If laughter is the best medicine, who are the best doctors?
    • (Score: 5, Insightful) by JoeMerchant on Friday May 17 2019, @01:58PM (8 children)

      by JoeMerchant (3937) on Friday May 17 2019, @01:58PM (#844701)

      such dangerous experiments

      Like letting 16 year olds drive? Public roads are not safe, period. They never have been, and as technology and driver training have improved their safety, overcrowding has reduced it at a nearly matching pace.

      --
      🌻🌻 [google.com]
      • (Score: 2, Interesting) by Anonymous Coward on Friday May 17 2019, @02:33PM

        by Anonymous Coward on Friday May 17 2019, @02:33PM (#844710)

        My father had access to an off road area to teach me to drive, which he did. I was 5 years old when I "solo-ed" in a stick shift vehicle. I believe that lots of farm kids have a similar experience. Helps a lot to know the basics of vehicle control when it's time to get licensed (at 16 or whatever age).

      • (Score: 2) by edIII on Friday May 17 2019, @09:05PM (6 children)

        by edIII (791) on Friday May 17 2019, @09:05PM (#844837)

        All the more reason to restrict the testing and development to closed test tracks with extensive regression testing.

        --
        Technically, lunchtime is at any moment. It's just a wave function.
        • (Score: 3, Interesting) by JoeMerchant on Friday May 17 2019, @09:41PM (5 children)

          by JoeMerchant (3937) on Friday May 17 2019, @09:41PM (#844846)

          I interact with Special Needs individuals (Down's, autism, etc.) who need to drive to support their independence. Even the Uber self-driving program isn't as scary as they are, and there are tens, perhaps hundreds of thousands of them nationwide actively driving.

          What's wrong with both (automated and special people) is that they are harder to predict - when you commute on the freeway, you're out there with thousands of relatively predictable people; even if they aren't following "the rules" they are "safe" because you can reliably predict what they're going to do. It's not too different from rookie driving programs in motorsports - rookies learn, by experience on track with the target population, how to compete without getting everybody killed.

          Of course, driving in the US is "safe" like drinking water from the Ganges is "safe" - odds are not too terrible, but big risk is always out there.

          --
          🌻🌻 [google.com]
          • (Score: 3, Informative) by lentilla on Friday May 17 2019, @11:25PM (1 child)

            by lentilla (1770) on Friday May 17 2019, @11:25PM (#844877)

            Your post has me horrified. I don't care how much someone "needs" to drive to "support their independence", if they can't drive properly they should not be allowed to drive. That goes for special needs people, aged people, recalcitrant drunk drivers and those that simply can't or won't learn to drive in a moderately skilled and safe manner.

            The only reason we tolerate learners (youngsters) is that they learn quickly and are supervised whilst doing so.

            • (Score: 2) by JoeMerchant on Saturday May 18 2019, @12:08PM

              by JoeMerchant (3937) on Saturday May 18 2019, @12:08PM (#844991)

              if they can't drive properly

              Well, that's the question, isn't it? And, the proof is in the accident rate. At least among the SN population I know, they have a roughly average "starting driver" accident rate - one or two minor collisions in the first 5 years, tapering down with experience.

              However, I think their parents/caregivers agree: it's pretty terrifying at first. On the other hand, if they live in rural Alabama and they can't drive, that makes them 100% dependent on other transportation, which - given the historical accident rate - is unjustified.

              I will say, there are some who try it and give it up, because they never do learn to drive well - which is more common among the SN population than the "normies" - there are plenty of "normies" who get into a dozen crashes or more and still drive.

              --
              🌻🌻 [google.com]
          • (Score: 2) by edIII on Saturday May 18 2019, @12:27AM (2 children)

            by edIII (791) on Saturday May 18 2019, @12:27AM (#844889)

            I am likewise horrified. Under no circumstances should somebody with special needs like that be able to drive, and their independence is a wholly insufficient reason to introduce such risks to the rest of us. People don't respect what driving *is*, period. It isn't a right, but a privilege, and it absolutely must be looked at for what it truly is: the operation of a multi-ton vehicle at high speeds (30 is still a high speed).

            What the heck is wrong with disabled transit services? Like you, I've helped people with disabilities. However, I'm the one driving, not them. I helped them achieve independence by providing bus passes and teaching them the bus routes to get to where they need to go.

            Truly, I haven't been this shocked and horrified since Orange Anus was elected dictator of the US. People with Down Syndrome driving? Really? Sweet Jesus save us all.... This is why we desperately need a fundamental shift in how we construct cities and neighborhoods to encourage and allow spaces for walking, biking, etc. The people you help should find independence in a safer environment and setting than multi-ton vehicles on a freeway.

            --
            Technically, lunchtime is at any moment. It's just a wave function.
            • (Score: 2) by JoeMerchant on Saturday May 18 2019, @12:14PM

              by JoeMerchant (3937) on Saturday May 18 2019, @12:14PM (#844992)

              I am likewise horrified. Under no circumstances should somebody with special needs like that be able to drive, and their independence is a wholly insufficient reason to introduce such risks to the rest of us.

              Your prejudice is common, and completely understandable.

              Overall, the special needs driving accident rate isn't much different from the general population's, and there are more "non special needs" drivers in the general population with a much higher accident rate whom the courts continue to allow to drive, because the courts deem their independence (employability) a wholly sufficient reason to keep giving them chances to injure the rest of us.

              As I mentioned elsewhere - the main difference I have seen, long term, in the special needs driving population is that they are more conservative when they start driving, and the ones who can't seem to manage it well are more likely to retire from driving than a "non special" person is.

              --
              🌻🌻 [google.com]
            • (Score: 2) by JoeMerchant on Saturday May 18 2019, @12:30PM

              by JoeMerchant (3937) on Saturday May 18 2019, @12:30PM (#844994)

              What the heck is wrong with disabled transit services?

              Rural Alabama? Even in a big city with good service, what would be a 15 minute trip to the store turns into a 3 hour scheduled ordeal, with a fair chance of not happening until tomorrow, or the next day. I will say, most of the special needs drivers I know are doing it in more rural settings, though some have moved into the bigger cities successfully, and some have retired from driving when they got to city traffic.

              we desperately need a fundamental shift in how we construct cities and neighborhoods to encourage and allow spaces for walking, biking, etc. The people you help should find independence in a safer environment and setting than multi-ton vehicles on a freeway.

              Special needs has got nothing to do with it; our current city architecture is highly resource-consumptive and subjects us all to far too much risk of serious bodily harm - not to mention the daily grind (waste of time) of commuting long distances to balance home price vs. location. The Police called it in 1983, and we haven't done a thing to improve it in the 36 years since:

              Another working day has ended.
              Only the rush hour hell to face.
              Packed like lemmings into shiny metal boxes.
              Contestants in a suicidal race.
              Daddy grips the wheel and stares alone into the distance,
              He knows that something somewhere has to break.
              He sees the family home now looming in his headlights,
              The pain upstairs that makes his eyeballs ache.

              --
              🌻🌻 [google.com]
  • (Score: 5, Interesting) by The Shire on Friday May 17 2019, @01:24PM (22 children)

    by The Shire (5824) on Friday May 17 2019, @01:24PM (#844688)

    It's worth noting that

    1) The driver was doing 68 in a 55mph zone
    2) The autopilot had just been turned on 10 seconds before the accident
    3) The truck driver made a left turn across the highway in front of the Tesla

    The driver's hands were not on the wheel, his eyes were not on the road, and he was speeding through an intersection, coupled with the truck driver who did not yield as he entered the intersection.

    This was not an accident that any human driver would have survived, let alone an autonomous vehicle.

    • (Score: 4, Insightful) by Rivenaleem on Friday May 17 2019, @01:35PM (2 children)

      by Rivenaleem (3400) on Friday May 17 2019, @01:35PM (#844692)

      How presumptuous would it be to picture a scene where the driver wants something from the back seat, so turns on Autopilot and proceeds to rummage about in the back, with no idea whatsoever what is going on on the road ahead of him?

      • (Score: 2) by JoeMerchant on Friday May 17 2019, @02:06PM

        by JoeMerchant (3937) on Friday May 17 2019, @02:06PM (#844702)

        picture a scene where the driver wants something from the back seat

        That's called the real world - which is something entirely different from lawyers, lawsuits, courtrooms, etc.

        --
        🌻🌻 [google.com]
      • (Score: 2) by bob_super on Friday May 17 2019, @07:30PM

        by bob_super (1357) on Friday May 17 2019, @07:30PM (#844818)

        In a Level 2 system, that's a potentially fatal mistake. I'm guessing he just checked his phone.

        Trucks on these kinds of 55 mph highways just pull out when they see a halfway decent opening, and expect cars to adjust speed and direction to avoid hitting the giant obstacle lumbering across. I see it almost every day. Given the place of impact (while speeding), avoiding that particular occurrence would have required hitting the brakes, and potentially swerving onto the right shoulder.

    • (Score: 1, Interesting) by Anonymous Coward on Friday May 17 2019, @01:45PM (3 children)

      by Anonymous Coward on Friday May 17 2019, @01:45PM (#844696)

      From the Verge link,

      In a statement, Tesla confirmed that series of events. “We are deeply saddened by this accident and our thoughts are with everyone affected by this tragedy,” a Tesla spokesperson said. “Tesla drivers have logged more than one billion miles with Autopilot engaged, and our data shows that, when used properly by an attentive driver who is prepared to take control at all times, drivers supported by Autopilot are safer than those operating without assistance. For the past three quarters we have released quarterly safety data directly from our vehicles which demonstrates that.”

      It all falls apart when they say, "...by an attentive driver who is prepared to take control at all times" -- the problem is that this driver doesn't exist; anyone not actively driving is effectively unable to quickly assess the driving situation (get up to speed?) and make reasonable decisions.

      Similar thing with the paid driver in the Uber accident in Tempe, AZ -- the temptation to be distracted is too strong.

      • (Score: 3, Insightful) by JoeMerchant on Friday May 17 2019, @01:56PM (2 children)

        by JoeMerchant (3937) on Friday May 17 2019, @01:56PM (#844700)

        when used properly by an attentive driver who is prepared to take control at all times

        So, space aliens can use the Tesla autopilot safely? I do not know a single human being who could make a 2 hour road trip with the Autopilot engaged and maintain attention / preparation to take control at all times. Some backseat drivers might approach 75% readiness, but even their attention lapses when their hands aren't on the wheel.

        --
        🌻🌻 [google.com]
        • (Score: 3, Funny) by lentilla on Friday May 17 2019, @11:34PM (1 child)

          by lentilla (1770) on Friday May 17 2019, @11:34PM (#844879)

          Some backseat drivers might approach 75% readiness but even their attention lapses when their hands aren't on the wheel.

          Hmmm. They should have taken their mother-in-law on the trip. That's 110% attention to detail right there.

          • (Score: 2) by JoeMerchant on Saturday May 18 2019, @12:19PM

            by JoeMerchant (3937) on Saturday May 18 2019, @12:19PM (#844993)

            What needs to be noted is that even with hands on the wheel, almost no one maintains 100% attention to the full situation - to the front, to the sides, to all potential collisions in case somebody does something unexpected.

            Hell, around here (Florida) you're lucky if 5% of the drivers around you aren't staring down at their phone screens at any given moment. Try this: while you're sitting at a light, take a good look into the oncoming cars as they go by - try to count 20 in a row that appear, from the outside, to be paying full attention - not holding/looking into a phone, down at the radio, applying makeup, turned to look at a passenger while talking to them, etc. I don't think you'll get to 20, most days you won't even get to 10.

            --
            🌻🌻 [google.com]
    • (Score: 4, Interesting) by JoeMerchant on Friday May 17 2019, @01:54PM (9 children)

      by JoeMerchant (3937) on Friday May 17 2019, @01:54PM (#844697)

      The driver was doing 68 in a 55mph zone... The driver's hands were not on the wheel, his eyes were not on the road, and he was speeding through an intersection, coupled with the truck driver who did not yield as he made a left turn across the highway in front of the Tesla.

      Sounds like an average day driving in Florida, to me. The real question is: did the Tesla autopilot do better or worse than an average 50-year-old Florida man actively driving the vehicle without an autopilot would have?

      The autopilot had just been turned on 10 seconds before the accident

      10 seconds is a relative eternity at 68mph, and 68 in a 55 is just a normal day on most Florida roads.

      My 70 year old father has likely driven 2 million miles (maybe as much as the Tesla autopilots so far?), and for his 70th birthday a jacked-up pickup truck did not yield as it turned left across an intersection just a moment before my father entered it at 50mph. The airbags worked, but he had a severely broken femur and plenty of other crash damage that left him with severely reduced mobility for over a year, and of course he will never fully recover.

      --
      🌻🌻 [google.com]
      • (Score: 2) by AthanasiusKircher on Friday May 17 2019, @03:56PM (6 children)

        by AthanasiusKircher (5291) on Friday May 17 2019, @03:56PM (#844753) Journal

        10 seconds is a relative eternity at 68mph

        Indeed -- roughly 1000 feet, or an entire football field (plus most of the endzones) in distance. This is a calculation every person who ever thinks of texting or even just doing something brief on their phone should do. A lot of stuff can change on the road in 1000 feet, but people frequently engage in activities that pull their attention from the road for several seconds.

        Hence why driving is still the most dangerous activity the average person participates in on a regular basis. It's not something to cavalier about.

        • (Score: 2) by JoeMerchant on Friday May 17 2019, @04:21PM (4 children)

          by JoeMerchant (3937) on Friday May 17 2019, @04:21PM (#844763)

          It's not something to cavalier about

          Absolutely, I much prefer an Impala, or a tricked out Malibu. ;-) (OUS: Google "Chevrolet Models")

          When I'm commuting on 3-5 lanes each way of interstate to-from work, speed limit 65, commonly observed slowest vehicles moving up to 80mph, I'm continually shocked at the number of commuters who are following at less than 1 second distance to the car in front.

          I'm no longer shocked at the daily accidents, and the traffic jams they regularly create. Fatal accidents are fairly rare, but accidents that could have been fatal with a little worse luck are all too common.
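
          For anyone who wants numbers behind that, here's a minimal back-of-the-envelope sketch in Python (the speed, reaction time, and braking rates below are illustrative assumptions, not figures from any report):

          def stopping_gap_needed(v_mps, reaction_s, decel_follow, decel_lead):
              # Minimum initial gap (metres) needed to avoid hitting a lead car
              # that brakes to a full stop, assuming both start at speed v_mps.
              lead_stop = v_mps ** 2 / (2 * decel_lead)
              follow_stop = v_mps * reaction_s + v_mps ** 2 / (2 * decel_follow)
              return follow_stop - lead_stop

          v = 30.0  # ~68 mph in m/s
          # Lead car brakes hard (7 m/s^2); follower reacts after 1 s with the same braking.
          print(stopping_gap_needed(v, reaction_s=1.0, decel_follow=7.0, decel_lead=7.0))
          # -> ~30 m, i.e. a one-second gap at this speed is the bare minimum, with zero margin.

          Anything under a second only works for as long as the car ahead never brakes hard.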

          --
          🌻🌻 [google.com]
          • (Score: 4, Insightful) by AthanasiusKircher on Friday May 17 2019, @05:52PM (3 children)

            by AthanasiusKircher (5291) on Friday May 17 2019, @05:52PM (#844796) Journal

            I'm continually shocked at the number of commuters who are following at less than 1 second distance to the car in front.

            I think it was about 4 years ago that I witnessed an accident in the left lane on the highway about 4 cars in front of me. I can't remember what precipitated it, but the 4-5 cars in front of me piled up, one of them being run off the road and another being hit and spun completely around on the highway. I had left a reasonable cushion in front of me (as I always do), but still barely had a chance to come to a stop as I swerved out of the lane.

            Just a few minutes before, I remember being tailgated by one of those cars, which had sped around me and passed me on the right, simply because I wasn't willing to tailgate the car in front of me (I wasn't going slower than the traffic in front of me -- just leaving a reasonable gap).

            For many years I have tried to follow the rule about leaving a gap, but now I'm even more certain to do it. Unfortunately, I think most people never experience something like this and just assume all will be okay, because they tailgate all the time and nothing bad has happened to them.

            I don't really understand why the police don't ticket more people for tailgating, as it's generally a lot more dangerous and likely to cause accidents than minor speeding or other minor stuff people often get pulled over for. Maintaining a safe following distance is actually the one reason I look forward to the days of autonomous cars (which unfortunately I think are much further in the future than most people seem to think, at least without dedicated roads for them). But nothing will stop idiocy: I'm sure an autonomous car maintaining a safe following distance will get passed and cut off in even more dangerous maneuvers, as people become impatient as if filling in the constant gap in front of them will somehow get them to their destination more than a few seconds faster.

            The ironic thing for me is how it's those idiots who actually cause most traffic jams. Not just with accidents, but by following too closely and thereby necessitating huge braking maneuvers in strings of cars with just one minor error, merge, cut-off, etc. If you maintain a reasonable distance and something like that happens, you probably don't even need to brake -- just lay off the accelerator, and you and everyone behind you keeps moving. Tailgate and then you have to brake hard, and all the dozen tailgaters behind you brake hard, setting up accordion-like traffic waves that propagate backward and end up screwing up everyone's commute so we all take twice as long to get home. But hey -- at least you didn't allow that darn gap of a few car lengths to be open in front of you!

            • (Score: 3, Informative) by bob_super on Friday May 17 2019, @07:36PM (1 child)

              by bob_super (1357) on Friday May 17 2019, @07:36PM (#844819)

              > I remember being tailgated by one of those cars, which had sped around me and passed me on the right

              I do have to point out that if a tailgater passed you on the right, the smart thing to do would have been to merge right as soon as you could to let the tailgater go.
              He's an asshole (pretty safely assuming it's a guy), but you shouldn't have stayed in the left lane in front of an asshole, if there was enough space to GET BACK TO THE RIGHT.

              • (Score: 2) by AthanasiusKircher on Thursday May 23 2019, @01:34AM

                by AthanasiusKircher (5291) on Thursday May 23 2019, @01:34AM (#846470) Journal

                It was heavy rush hour traffic. The car took advantage of a rare gap and cut people off to do it.

                Everyone was tailgating. Hence the pileup. It is standard rush hour practice in all lanes in some areas.

                Normally I do what you recommend when I can do it safely. I could not. He did not.

            • (Score: 4, Interesting) by lentilla on Friday May 17 2019, @11:46PM

              by lentilla (1770) on Friday May 17 2019, @11:46PM (#844881)

              following at less than 1 second distance to the car in front

              I have driven a number of cars with active radar that will follow the car in front and leave an appropriate amount of distance to the preceding car (the distance is even adjustable). These are higher-end models, but the technology will find its way into regular models in short order. I would like to see this implemented directly in the accelerator (gas pedal): even if active cruise control is turned off, if you follow too closely, a servo on the accelerator pedal actively pushes back. So yes, you can follow the car in front leaving a microsecond gap, but you have to constantly work hard to do so. My theory is that most people won't put in the effort, and once "everyone" is leaving an appropriate safety buffer, the drive to tailgate will be greatly diminished.
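
              Purely as an illustration of that pedal-pushback idea (every name and threshold here is hypothetical, not taken from any real ADAS product), the control logic could be as simple as ramping resistance up as the time gap shrinks:

              def pedal_resistance(gap_m, speed_mps, min_gap_s=2.0, max_force_n=40.0):
                  # Extra resistance (newtons) the accelerator servo applies, based on
                  # the radar-measured gap to the car ahead. All figures are made up.
                  if speed_mps <= 0:
                      return 0.0
                  time_gap_s = gap_m / speed_mps
                  if time_gap_s >= min_gap_s:
                      return 0.0  # safe gap: pedal feels normal
                  # Ramp linearly from 0 to max as the gap closes below the threshold.
                  return max_force_n * (min_gap_s - time_gap_s) / min_gap_s

              # Example: 20 m behind a car at 30 m/s (~68 mph) is a 0.67 s gap,
              # so the pedal pushes back with roughly two-thirds of its maximum force.
              print(pedal_resistance(gap_m=20, speed_mps=30))

              You could still floor it, but you'd be working against the servo the whole time, which is exactly the friction the idea counts on.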

        • (Score: 2) by deimtee on Saturday May 18 2019, @09:54AM

          by deimtee (3272) on Saturday May 18 2019, @09:54AM (#844972) Journal

          I think you mixed feet and yards in there somewhere. 68 mph is about 30 m/s, so 10 seconds is roughly 300 metres, or 330 yards.
          That's about three American football fields.
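
          Spelled out as a minimal sketch (the 68 mph and 10 second figures come from the thread above; the rest is just unit arithmetic):

          MPH_TO_MPS = 0.44704          # 1 mph in m/s (exact by definition)

          speed_mps = 68 * MPH_TO_MPS   # ~30.4 m/s
          distance_m = speed_mps * 10   # distance covered in 10 seconds
          distance_ft = distance_m / 0.3048
          distance_yd = distance_ft / 3

          print(f"{speed_mps:.1f} m/s -> {distance_m:.0f} m "
                f"({distance_ft:.0f} ft, {distance_yd:.0f} yd) in 10 s")
          # ~304 m, ~997 ft, ~332 yd: about three American football fields, not one.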

          --
          If you cough while drinking cheap red wine it really cleans out your sinuses.
      • (Score: 2) by The Shire on Friday May 17 2019, @11:20PM (1 child)

        by The Shire (5824) on Friday May 17 2019, @11:20PM (#844873)

        But is 10 seconds enough time for the AI to get its bearings sufficiently to handle a sudden impact situation? Clearly it is not.

        • (Score: 2) by JoeMerchant on Saturday May 18 2019, @12:03PM

          by JoeMerchant (3937) on Saturday May 18 2019, @12:03PM (#844990)

          Autopilot or not, if you're doing 68 mph on a highway where vehicles can suddenly and unexpectedly turn left in front of you... that can be an unavoidable accident.

          --
          🌻🌻 [google.com]
    • (Score: 2) by All Your Lawn Are Belong To Us on Friday May 17 2019, @02:55PM (2 children)

      by All Your Lawn Are Belong To Us (6553) on Friday May 17 2019, @02:55PM (#844721) Journal

      No, but it is likely that this is an accident that most human drivers would not have allowed to happen if their eyes were on the road and their hands were on the wheel. Could a driver have still been distracted and gotten into that accident? Yes... a sister of my brother-in-law and her spouse died exactly that way in the 70s on I-94. But a human could have avoided this accident where the machine did not.

      And if you've seen the photos of the vehicle (see my next post with the NTSB link) it looks like the body of the vehicle passed under the trailer / decapitated the top half... perhaps an alert driver would have ducked into the passenger side below the dash.

      --
      This sig for rent.
      • (Score: 3, Interesting) by tangomargarine on Friday May 17 2019, @03:32PM (1 child)

        by tangomargarine (667) on Friday May 17 2019, @03:32PM (#844740)

        And if you've seen the photos of the vehicle (see my next post with the NTSB link) it looks like the body of the vehicle passed under the trailer / decapitated the top half... perhaps an alert driver would have ducked into the passenger side below the dash.

        Would that actually save your life? I'd be worried about the airbag going off killing you anyway, as they're designed with the assumption that your head is in a reasonable position when they go off, not at waist level, aren't they? Hit from the wrong angle they can break your neck.

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 2) by bob_super on Friday May 17 2019, @09:34PM

          by bob_super (1357) on Friday May 17 2019, @09:34PM (#844843)

          The car stopped about 500 yards later.
          The airbags kill you by blowing parts of you backwards/sideways at the same time that you get sent forwards suddenly by a crash.
          In this case, there wasn't a sudden negative acceleration (well, not much), so I don't believe the airbag injuries would likely be fatal, especially since ducking towards the passenger seat puts you below the zone of maximum airbag impact.
          Whether someone's cellphone still gets crushed into vital organs is an exercise left to the reader.

    • (Score: 2) by Nuke on Friday May 17 2019, @09:56PM

      by Nuke (3162) on Friday May 17 2019, @09:56PM (#844848)

      The Shire wrote :-

      The driver's hands were not on the wheel, his eyes were not on the road, and he was speeding through an intersection, coupled with the truck driver who did not yield as he entered the intersection. This was not an accident that any human driver would have survived, let alone an autonomous vehicle.

      That is not an accident I would have had.

    • (Score: 2) by Magic Oddball on Friday May 17 2019, @10:20PM

      by Magic Oddball (3847) on Friday May 17 2019, @10:20PM (#844858) Journal

      Ten seconds doesn't initially sound like very long, but if you actually try counting it out, it turns out to be a pretty substantial amount of time. It would certainly be long enough to slow down and swerve, both of which would greatly increase the chances of survival. Ideally, the driver would react and the Tesla's software would readjust things (e.g. as anti-lock brakes do) to be safer; even if the driver doesn't (or can't) react, however, the software should recognize the hazard and take action of some kind.
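
      As a rough check (a minimal sketch; the 68 mph figure is from the report, while the ~7 m/s² braking rate is an ordinary assumption for hard braking on dry pavement), ten seconds is several times what a full stop would take:

      v0 = 68 * 0.44704          # 68 mph in m/s (~30.4 m/s)
      decel = 7.0                # assumed hard-braking deceleration, m/s^2

      time_to_stop = v0 / decel                 # ~4.3 s
      distance_to_stop = v0 ** 2 / (2 * decel)  # ~66 m

      print(f"full stop in {time_to_stop:.1f} s over {distance_to_stop:.0f} m")
      # Ten seconds at that speed covers ~304 m, so there was ample time to brake
      # hard, let alone merely slow down and swerve.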

      68mph in a 55mph zone isn't particularly fast, either. The posted speed limit for the main highway near my home is 65mph, but there are so many cars going 80+mph that I have to frequently check my speedometer to keep my own speed from creeping up from 75 to match theirs.

  • (Score: 3, Insightful) by Anonymous Coward on Friday May 17 2019, @01:56PM (4 children)

    by Anonymous Coward on Friday May 17 2019, @01:56PM (#844699)

    Were it any brand other than Tesla, this would be a non-story about two bad drivers. One bad driver cut off traffic with his truck. The second bad driver engaged cruise control and then stopped paying attention to the car's driving, running into the first bad driver.

    But no, there's a Tesla in the mix, so OMG, Elon Musk must be to blame.

    • (Score: 2) by All Your Lawn Are Belong To Us on Friday May 17 2019, @03:01PM (2 children)

      by All Your Lawn Are Belong To Us (6553) on Friday May 17 2019, @03:01PM (#844724) Journal

      Were it <s>any brand other than Tesla</s> a human controlled vehicle without an autopilot, this would be a non-story about two bad drivers.

      FTFY.

      --
      This sig for rent.
      • (Score: 0) by Anonymous Coward on Friday May 17 2019, @04:20PM (1 child)

        by Anonymous Coward on Friday May 17 2019, @04:20PM (#844762)

        Were it <s>any brand other than Tesla</s> a human controlled vehicle without an autopilot, this would be a non-story about two bad drivers.

        FTFY.

        There was nothing wrong with the original phrasing. The so-called "autopilot" is nothing more than a sexily-named, glorified cruise control. Blame Tesla marketing for that. The driver is expected to stay focused on driving. Anyone dumb enough at this point to believe that the sexy name means full autonomous driving just shows that Tesla has found a way of identifying people ripe for natural deselection.

        • (Score: 3, Insightful) by Nuke on Friday May 17 2019, @10:05PM

          by Nuke (3162) on Friday May 17 2019, @10:05PM (#844851)

          Yes, yes, we know that "The so-called "autopilot" is nothing more than a sexily-named glorified cruise control", as you rightly say. But the point is that the focus of the story would still have been the incapability of computer controlled cars, whatever name you give the system and whatever the name of the company that made it.

    • (Score: 1) by bmimatt on Friday May 17 2019, @04:12PM

      by bmimatt (5050) on Friday May 17 2019, @04:12PM (#844757)

      It still would've been a story. The title would be something like: "Florida Man Just Crashed a Tesla". It is, after all, Florida Man we are talking about.

  • (Score: 5, Interesting) by iamjacksusername on Friday May 17 2019, @02:10PM (6 children)

    by iamjacksusername (1479) on Friday May 17 2019, @02:10PM (#844704)

    Supplemental autonomy in cars is generally good - anti-lock brakes, cruise control, traction control - all of these autonomous systems are generally very helpful to a driver. However, they do not change the fundamentals of driving. You still need to break to stop, you still need to control your car when changing speeds, you still need to control your turns and speeds. As we move through the layers of autonomy, L2 and L3 as defined by DOT, we see the lines start to blur. We are requiring people to supervise the autonomous systems and decide when to intervene. This is something most people are not good at; it is doable - pilots have been functioning with L2 and L3 levels of autonomy for years - but it requires a significant investment in training as well as motivation on the part of the student.

    My anecdotal experience working in IT is that most people will accept whatever is "good enough" at the lowest effort (for them). If the company calls something autopilot, people are most definitely going to treat it as such. If a system requires no active participation from the user, the user will not participate. Autopilot, or more accurately cruise control + assisted steering, is dangerous because it seems "good enough" to people when in reality it is far from good enough.

    A counter-point to the safety issues of Autopilot is 4-wheel steering as implemented by GM in the mid-2000s. GM had been experimenting with and testing this for the better part of 20 years before they actually put it into production. Every person I met who had this option on their trucks absolutely loved it. GM took the time to implement it correctly so that it was a self-contained, supplemental autonomous system with defined behaviors. Essentially, it worked as reliably as anti-lock brakes. However, implemented imperfectly, it would have been a disaster - think fully loaded pickup trucks flipping themselves at 70mph on a highway. As I recall, there were built-in speed-interval behavior changes - it worked different ways at different speeds - as well as a few other parameters which I cannot recall. Essentially, the driver drove as they normally would and the autonomous system handled the details. It was only discontinued due to expense - it was an expensive option not many people got, and it was very expensive to insure due to repair costs.

    My point is that I strongly believe we need to progress from L1 automation directly to L4, if not L5. I tend to think the two levels will be implemented at roughly the same time, so the distinction between L4 and L5 is meaningless - again, lowest common denominator. L5 requires navigation of unimproved and dirt roads; I would not trust most city drivers on a dirt road, so when we reach L4 we will be, for all practical policy considerations, at full automation. However, the intermediate levels are too dangerous to implement in production, as they will actively encourage detrimental behaviors on the drivers' part - inattentiveness, carelessness, driving impaired (we have a problem with this already - semi-autonomous cars will make this worse) - that will lead to worse outcomes.

    • (Score: 2, Informative) by Anonymous Coward on Friday May 17 2019, @02:40PM (1 child)

      by Anonymous Coward on Friday May 17 2019, @02:40PM (#844712)

      Just one quick note on your nice post:
      > L2 and L3 as defined by DOT

      L2 and L3 as defined by SAE (Society of Automotive Engineers). DOT may be using the initial SAE definitions? Last I heard, I think these definitions are currently being refined by the appropriate SAE committee.

    • (Score: 0) by Anonymous Coward on Friday May 17 2019, @05:34PM

      by Anonymous Coward on Friday May 17 2019, @05:34PM (#844786)

      > Supplemental autonomy in cars is generally good - anti-lock brakes, ...

      For future reference, all of these "nanny" assist systems (some better than others, imo) are currently lumped as "ADAS" -- advanced driver assistance systems.

      Huge amounts of proving-ground testing have fostered a whole new industry making specialized equipment for ADAS testing. It started with driving robots that can steer/brake (and clutch/shift for manual cars) and are linked to path-following nav systems. These can be linked with radios and a central controller so that their movements are synced. The test designer can choreograph a wide range of scenarios with multiple vehicles (and repeat them to differential GPS accuracy, about 1 cm).

      Then there are all kinds of "targets" that are also programmable. Some are small electric vehicles only a few inches high, with tapered edges so a full-size car/truck can run right over them with no damage. On top of these moving platforms can be a dummy pedestrian or cyclist to test computer-vision systems. Larger versions hold a dummy body of a car made of light plastic and fabric, with foil (etc.) so that they have the visual and radar cross-section of a normal car; no harm done if they are hit, they just snap back together. There are also rail systems that lie on the ground (similar to a curtain-rod pull cord) and move a dummy into the road at a specified time and speed.
       

    • (Score: 2) by Magic Oddball on Friday May 17 2019, @10:33PM (1 child)

      by Magic Oddball (3847) on Friday May 17 2019, @10:33PM (#844863) Journal

      Upvoted, though I found this typo amusing:

      You still need to break to stop

      ...and if you don't brake to stop, then "break" is exactly what you'll be doing. :o)

      • (Score: 2) by dry on Saturday May 18 2019, @05:46AM

        by dry (223) on Saturday May 18 2019, @05:46AM (#844944) Journal

        I figured he had a Ford and was used to stopping due to breakage.

    • (Score: 0) by Anonymous Coward on Saturday May 18 2019, @10:07AM

      by Anonymous Coward on Saturday May 18 2019, @10:07AM (#844974)

      I knew a guy who had a Honda Prelude with four-wheel steering. He absolutely loved it. The maintenance on the car was mildly expensive, but the real killer was finding mechanics who knew how to deal with it.

  • (Score: 2) by All Your Lawn Are Belong To Us on Friday May 17 2019, @02:57PM

    by All Your Lawn Are Belong To Us (6553) on Friday May 17 2019, @02:57PM (#844722) Journal

    The NTSB site is nicely navigable when it comes to finding reports (actually Google gave up the link easily).

    Here is NTSB's preliminary report [ntsb.gov], and it's worth a read on its own.

    --
    This sig for rent.
  • (Score: 2) by Phoenix666 on Friday May 17 2019, @06:58PM

    by Phoenix666 (552) on Friday May 17 2019, @06:58PM (#844811) Journal

    The driver was probably holding the steering wheel wrong [knowyourmeme.com].

    --
    Washington DC delenda est.
  • (Score: 2) by chewbacon on Friday May 17 2019, @10:38PM (1 child)

    by chewbacon (1032) on Friday May 17 2019, @10:38PM (#844865)

    Many, many more deaths occur hourly in non-autonomous vehicle crashes.

    • (Score: 2) by Pslytely Psycho on Saturday May 18 2019, @04:39AM

      by Pslytely Psycho (1218) on Saturday May 18 2019, @04:39AM (#844933)

      I don't see where that is relevant to the story. After all, that is 'the normal state of affairs.'
      And as a former driver trainer, IMHO this crash simply would not have occurred in a non-autonomous vehicle with an undistracted driver of reasonable competence.

      --
      Alex Jones lawyer inspires new TV series: CSI Moron Division.