posted by martyb on Friday May 17 2019, @09:48AM
from the keep-your-eyes-on-the-road-and-your-hands-on-the-wheel dept.

Tesla's advanced driver assist system, Autopilot, was active when a Model 3 driven by a 50-year-old Florida man crashed into the side of a tractor-trailer truck on March 1st, the National Transportation Safety Board (NTSB) states in a report released on Thursday. Investigators reviewed video and preliminary data from the vehicle and found that neither the driver nor Autopilot "executed evasive maneuvers" before striking the truck.

[...] The driver, Jeremy Beren Banner, was killed in the crash. It is at least the fourth fatal crash of a Tesla vehicle involving Autopilot.

This crash is eerily similar to a May 2016 incident near Gainesville, Florida, in which Joshua Brown was killed when his Model S sedan collided with a semitrailer truck on a Florida highway, making him the first known fatality in a semi-autonomous car.

The National Highway Traffic Safety Administration (NHTSA) determined that a "lack of safeguards" contributed to Brown's death. Meanwhile, today's report is just preliminary, and the NTSB declined to place blame on anyone.

Source: The Verge

Also at Ars Technica.


Original Submission

 
  • (Score: 0) by Anonymous Coward on Friday May 17 2019, @11:32AM (10 children)

    by Anonymous Coward on Friday May 17 2019, @11:32AM (#844663)

    By now, students of self-driving car tech know that Tesla is using its customers for beta testing. There are bound to be a few casualties along the golden road to the autonomous future.

    Of course, not all the *customers* may realize this.

  • (Score: 1, Disagree) by Anonymous Coward on Friday May 17 2019, @12:27PM (7 children)

    by Anonymous Coward on Friday May 17 2019, @12:27PM (#844676)

    BUT can you really blame Tesla when the NHTSA, NTSB, etc. did nothing to stop them from placing an untested piece of technology on the roads like this?

    I mean, assisted cruise control was one thing, but anything more advanced should have been required to advertise itself as nothing more than the most capable documented safe form of assisted cruise control. Any 'autopilot'-esque feature should have required the drafting of safety and design documents before it was ever advertised, or allowed to operate as a full lane-assist, acceleration, and deceleration system, on any vehicle on public roads.

    Personally, I think this is the turning point for Tesla (not the beginning of the end, which has already been underway for some time). After this failure, lawsuits are going to come out of the woodwork and federal scrutiny is going to rain down on them, if not under this administration, then under the next. And when that time comes, it will be just like the 1960s all over again, and the time of the incumbent automotive manufacturers will return.

    • (Score: 4, Insightful) by All Your Lawn Are Belong To Us on Friday May 17 2019, @02:40PM (6 children)

      by All Your Lawn Are Belong To Us (6553) on Friday May 17 2019, @02:40PM (#844713) Journal

      Yes, you really can blame Tesla.
      And the NHTSA and NTSB, and Congress for all closing their eyes to it.
      There's more than enough blame to go around for a system that, by design, is unnecessary. As in, one is always supposed to remain attentive to the road at all times when on autopilot... so why have it?
      How odd that two commercial airliner crashes can ground an entire fleet of aircraft, yet after two crashes of cars with autopilots (plus pedestrians killed), the regulatory response is 'meh'.

      --
      This sig for rent.
      • (Score: 4, Interesting) by Knowledge Troll on Friday May 17 2019, @02:58PM (4 children)

        by Knowledge Troll (5948) on Friday May 17 2019, @02:58PM (#844723) Homepage Journal

        one is always supposed to remain attentive to the road at all times when on autopilot... so why have it?

        Only once have I seen a Tesla fan get this one right. He knew exactly what he was doing: supervising and training a robot. He was very clear about what the task was. He wasn't using the car to automatically get someplace; he was truly supervising the car and providing feedback, hoping to improve the Tesla fleet as a whole.

        That's one out of dozens of fans I've seen.

        • (Score: 0) by Anonymous Coward on Friday May 17 2019, @05:13PM (3 children)

          by Anonymous Coward on Friday May 17 2019, @05:13PM (#844778)

          Yes, I know one couple that also uses their Tesla Model S Autopilot this way. Both are "software royalty", with top-level research jobs at top university artificial intelligence labs.

          All the other Tesla owners I know... well, let's just say that they won't become my close friends. I'd hate to lose them, but I don't want the heartache of losing a really close friend this way.

          • (Score: 5, Insightful) by edIII on Friday May 17 2019, @09:04PM (2 children)

            by edIII (791) on Friday May 17 2019, @09:04PM (#844836)

            This I understand. I don't have a choice, though. I've already become close to people who have lane-change-assist technology as well as this auto-pilot crap, and not in Teslas, either.

            What makes it concerning is their attitude that they can stop paying attention. One friend bragged that he didn't lift his eyes from a movie playing on his tablet the whole way from Los Angeles to Las Vegas. They think these things are fucking Johnny Cabs from Total Recall, and they are not anywhere close to that level of tech yet. Musk can burn in hell for taking advantage of these people as his personal guinea pigs for testing.

            It was insanely premature to allow this tech on the road. We need five years of extensive testing with hundreds of vehicles on a closed test track before we can confidently say anything about the performance of the tech. They've done, IMO, less than 1% of the regression testing required. What's worse is that failures should mandate legally disabling these features until they go through several more cycles of development and testing.
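
            To be concrete about what a regression gate even means here, a rough Python sketch follows. This is purely illustrative: the scenario list, the stack.simulate() interface, and disable_feature() are all invented for the example, not any real vendor API.

                import json

                # Hypothetical scenario-based regression gate. Any scenario that has
                # ever failed in the field stays in this list forever.
                SCENARIOS = [
                    {"name": "crossing_semitrailer",    "expect": "brake_before_impact"},
                    {"name": "stopped_vehicle_in_lane", "expect": "brake_before_impact"},
                    {"name": "faded_lane_markings",     "expect": "stay_in_lane"},
                ]

                def regression_gate(stack):
                    """Replay every scenario; disable the feature if any regress."""
                    failures = [s["name"] for s in SCENARIOS
                                if stack.simulate(s["name"]) != s["expect"]]
                    if failures:
                        # The mandate proposed above: a failure legally disables the
                        # feature until more development/testing cycles pass.
                        stack.disable_feature("autopilot")
                        print("Feature disabled. Failing scenarios:", json.dumps(failures))
                    return not failures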

            ALL OF THAT is an ideal world. We haven't even gotten to the security concerns yet, and there was an article here recently about placing special stickers on the road to deliberately confuse these systems. So they not only need to perform extensive regression testing (all kinds of vehicles and situations), but also need to account for malicious activity, unless we want a cyber-terrorist throwing AI bombs onto the freeway.
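
            Testing for that kind of malicious input isn't exotic, either. A crude fuzz test against the sticker trick could look like the sketch below; numpy is real, but the perception.estimate_offset() interface is made up for illustration.

                import numpy as np

                def sticker_fuzz_test(perception, clean_image, trials=100, tol_m=0.5):
                    """Paste small high-contrast patches ('stickers') at random spots
                    and check that the lane-center estimate stays within tolerance."""
                    rng = np.random.default_rng(0)
                    baseline = perception.estimate_offset(clean_image)  # invented API
                    h, w = clean_image.shape[:2]
                    for _ in range(trials):
                        img = clean_image.copy()
                        y = int(rng.integers(0, h - 8))
                        x = int(rng.integers(0, w - 8))
                        img[y:y + 8, x:x + 8] = 255        # white square patch
                        if abs(perception.estimate_offset(img) - baseline) > tol_m:
                            return False                   # the system was fooled
                    return True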

            Reminds me of that scientist in Spider-Man: "We need to go back to formula."

            --
            Technically, lunchtime is at any moment. It's just a wave function.
            • (Score: 4, Insightful) by coolgopher on Saturday May 18 2019, @12:53AM (1 child)

              by coolgopher (1157) on Saturday May 18 2019, @12:53AM (#844894)

              It was insanely premature to allow this tech on the road.

              I'd also say that at least a quarter of the human drivers on our roads were allowed onto them insanely prematurely.

              In the end, I figure it kind of squares out. Humans do really dumb and dangerous things; robots do really dumb and dangerous things. *shrug* As a driver, it's my job to be prepared for others doing dumb and dangerous things, and ideally not to inflict such dumb and dangerous stuff on others.

              • (Score: 2) by Bot on Saturday May 18 2019, @09:45PM

                by Bot (3902) on Saturday May 18 2019, @09:45PM (#845138) Journal

                >robots do really dumb and dangerous things

                systemd made me do it

                --
                Account abandoned.
      • (Score: 2) by etherscythe on Sunday May 19 2019, @05:05PM

        by etherscythe (937) on Sunday May 19 2019, @05:05PM (#845268) Journal

        <sarcasm>Horses work great getting you from one place to another - why have cars at all?</sarcasm>

        I think the purpose of driver assist is pretty great: reduce traffic deaths due to human error, which is one of the biggest killers in first-world countries today.

        That said, I bet we have a lot to agree on: I also think using the name "Autopilot" for a system that is essentially only a safety watchdog with secondary system control was a huge mistake on Tesla's part, especially in beta status. Don't call it Autopilot until the car is literally driving completely by itself and we've tested it well enough to feel at least somewhat safe with it legally. The current name gives people completely the wrong impression of how the feature is meant to be used. The documentation says one thing, but naming it Autopilot essentially says, "Yeah, we only included that language because we were required to by law. We here in the cool kids club know that what it's really for is showing off to your girlfriend/drinking buddies how awesome you are by letting the car drive by itself while you don't pay attention, because you're so rad you can afford the car of the future."

        One of my coworkers recently bought an older used Model S, and I'm a little disturbed by how casual he is about letting the car do what it wants on Autopilot v1 hardware. On the other hand, I'm also impressed by the collision it has already avoided: he was driving slightly distracted (not watching several cars ahead), but with his hands on the wheel and eyes on the road, when the car in front of him slammed on its brakes in response to slowing traffic ahead. This speaks to what the vision really was: help people look out for themselves, because digital billboards shift, solicitors and homeless people stand with signs dangerously close to the road, the rising or setting sun glares right in your eyeballs, you just didn't sleep well last night, or any number of other temporary debilitations or distractions increase the risk of injury on the road.
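
        For the curious, the core arithmetic behind that kind of save is simple time-to-collision math. Here's a little Python sketch; the 4-second alert threshold is my assumption, not Tesla's actual tuning:

            def time_to_collision(gap_m, own_speed_mps, lead_speed_mps):
                """Constant-speed time-to-collision estimate, in seconds."""
                closing = own_speed_mps - lead_speed_mps
                if closing <= 0:
                    return float("inf")   # not closing on the lead car
                return gap_m / closing

            # Example: 40 m gap, own car at 30 m/s (~67 mph), lead car braking
            # down to 20 m/s: TTC = 40 / (30 - 20) = 4.0 s. At an assumed ~4 s
            # alert threshold the warning fires immediately, inside the window
            # where a driver glancing at a tablet hasn't even looked up yet.
            print(time_to_collision(40.0, 30.0, 20.0))   # -> 4.0

        A production system fuses radar and camera data and accounts for acceleration, but the core alerting logic isn't much deeper than this.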

        Tesla was bound to make some mistakes, but I think some of these collisions were easily preventable had this branding decision not been made recklessly in the name of marketing, exaggerating a feature just to stick another thumb in the eye of the established automakers. Musk is a central figure in that aspect of the business, and I will absolutely call him out on it.

        --
        "Fake News: anything reported outside of my own personally chosen echo chamber"
  • (Score: 0) by Anonymous Coward on Friday May 17 2019, @07:42PM (1 child)

    by Anonymous Coward on Friday May 17 2019, @07:42PM (#844821)

    How about the issue of false advertising? "Driver assist" is much different from "auto-pilot".

    • (Score: 2) by Bot on Saturday May 18 2019, @09:48PM

      by Bot (3902) on Saturday May 18 2019, @09:48PM (#845140) Journal

      False advertising, alright, but let us not assume Musk's real motives. For example, a factually accurate slogan, Burma-Shave style:

      Tesla

      increasing the average IQ of the nation

      crash by crash

      --
      Account abandoned.