posted by Fnord666 on Wednesday September 13 2017, @07:51PM   Printer-friendly
from the tragic-events dept.

http://abcnews.go.com/Technology/teslas-semi-autonomous-system-contributed-deadly-crash-feds/story?id=49795839

Federal investigators announced Tuesday that the design of Tesla's semiautonomous driving system allowed the driver of a Tesla Model S in a fatal 2016 crash with a semi-truck to rely too heavily on the car's automation.

"Tesla allowed the driver to use the system outside of the environment for which it was designed," said National Transportation Safety Board Chairman Robert Sumwalt. "The system gave far too much leeway to the driver to divert his attention."

The board's report declares the primary probable cause of the collision as the truck driver's failure to yield, as well as the Tesla driver's overreliance on his car's automation — or Autopilot, as Tesla calls the system. Tesla's system design was declared a contributing factor.

[...] A Tesla spokesperson provided a statement to ABC News that read, "We appreciate the NTSB's analysis of last year's tragic accident, and we will evaluate their recommendations as we continue to evolve our technology. We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times."

According to The Associated Press, members of the family of Joshua Brown, the driver of the Tesla, said on Monday that they do not blame the car or the Autopilot system for his death.

A National Highway Traffic Safety Administration report on the crash can be found here. The NTSB has not yet published its full report; a synopsis of it can be found here.

Also at The Verge and CNN


Original Submission

 
  • (Score: 4, Insightful) by takyon on Wednesday September 13 2017, @08:12PM (3 children)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Wednesday September 13 2017, @08:12PM (#567438) Journal

    The semi-autonomous system is only semi-culpable.

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 5, Informative) by frojack on Wednesday September 13 2017, @09:06PM (1 child)

      by frojack (1554) on Wednesday September 13 2017, @09:06PM (#567468) Journal

      And apparently headlines are only semi-believable as well.

      The board's report declares the primary probable cause of the collision as the truck driver's failure to yield, as well as the Tesla driver's overreliance on his car's automation.

      Two humans fucked up.

      Tesla gets the blame in the headlines.

      Driver ignored or silenced the warnings, 7 times according to some reports.

      Driver monitoring. Tesla monitors driver engagement through the interactions with the steering wheel, turn signal, and TACC speed setting stalk. If the system does not detect the driver’s hands on the steering wheel (assessed using microtorque measurements) or other signs of driver engagement for periods of time that vary depending on road class, vehicle speed, road curvature, and traffic conditions, an escalating series of warnings is presented. The warnings start with a visual alert indicating that hands on the steering wheel are required. If the driver does not respond to the visual warning, an audible chime is sounded after 15 seconds. A more pronounced chime is initiated if the driver does not respond after another 10 seconds. If the driver fails to respond to the third alert stage within five seconds, the system gradually slows the vehicle while maintaining position in the lane.

      Per NHTSA Report.
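
      For anyone who'd rather see that logic as code, here is a minimal sketch of the escalation ladder in Python. Only the 15/10/5-second intervals come from the report; the function names and callback structure are my own, not Tesla's firmware.

      import time

      # Sketch of the escalating-warning sequence quoted above (NHTSA timings).
      # Everything except the 15 s / 10 s / 5 s intervals is illustrative.
      def escalate(hands_on_wheel, stages):
          """Walk the warning ladder until the driver re-engages or it ends."""
          for wait_s, action in stages:
              time.sleep(wait_s)           # real waits; scale down to try it
              if hands_on_wheel():
                  return True              # driver re-engaged; stand down
              action()
          return False                     # ladder exhausted; car is slowing

      # A driver who never responds walks through all four stages.
      escalate(lambda: False, [
          (0, lambda: print("visual alert: hands on wheel required")),
          (15, lambda: print("audible chime")),
          (10, lambda: print("more pronounced chime")),
          (5, lambda: print("slowing gradually, holding the lane")),
      ])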

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 2) by choose another one on Thursday September 14 2017, @07:58AM

        by choose another one (515) Subscriber Badge on Thursday September 14 2017, @07:58AM (#567684)

        Two humans fucked up.
        Tesla^H^H^H^H vehicle manufacturer gets the blame in the headlines.
        Driver ignored or silenced the warnings, 7 times according to some reports.

        Pretty sure the aviation industry has been here many times before...

    • (Score: 1) by fustakrakich on Wednesday September 13 2017, @10:30PM

      by fustakrakich (6150) on Wednesday September 13 2017, @10:30PM (#567513) Journal

      No, it's semi-innocent. It was supposed to taser the driver when he didn't heed the warnings

      --
      La politica e i criminali sono la stessa cosa..
  • (Score: 2, Informative) by Revek on Wednesday September 13 2017, @08:15PM (12 children)

    by Revek (5022) on Wednesday September 13 2017, @08:15PM (#567440)

    No, anyone who blames the system for the driver who didn't use it correctly is a full time idiot.

    --
    This page was generated by a Swarm of Roaming Elephants
    • (Score: 5, Insightful) by n1 on Wednesday September 13 2017, @08:36PM (9 children)

      by n1 (993) on Wednesday September 13 2017, @08:36PM (#567447) Journal

      I agree with your sentiment, but Tesla and Musk have been very keen to promote Autopilot capabilities beyond what is really appropriate, in my opinion.

      When this crash happened, Tesla's website was promoting Autopilot as 'automatic steering, speed, lane changing and parking'.

      Around the same time elsewhere...

      Tesla removed a Chinese term for “self-driving” from its China website [reuters.com] after a driver in Beijing who crashed in “autopilot” mode complained that the car maker overplayed the function’s capability and misled buyers.

      More recently Elon Musk boasted [thestreet.com] on Twitter about watching the eclipse through the glass roof of a Model S whilst using Autopilot.

      Musk told reporters that the Model S was “probably better than humans at this point in highway driving”. Before the updated autopilot was released, he said that the car was “almost able to go [between San Francisco and Seattle] without touching the controls at all”.

      Talulah Riley, Musk’s [ex]wife, shared and deleted an Instagram video of herself driving on the highway between Los Angeles and San Diego without holding the wheel. [theguardian.com]

      This PR-versus-Terms-&-Conditions mismatch gets worse when there's the ever-present promotion that next week your car could get an update making Autopilot even better/worse than it was, so the intended use of the system changes while you sleep.

      • (Score: 3, Touché) by DannyB on Wednesday September 13 2017, @09:02PM (8 children)

        by DannyB (5839) Subscriber Badge on Wednesday September 13 2017, @09:02PM (#567465) Journal

        Yep. Autopilot may indeed be better than some humans at highway driving. But that doesn't make it a replacement for paying attention to the road and keeping your hands on the steering wheel, least of all while sending tweets.

        --
        To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
        • (Score: 4, Insightful) by frojack on Wednesday September 13 2017, @09:22PM (7 children)

          by frojack (1554) on Wednesday September 13 2017, @09:22PM (#567475) Journal

          The Straight Crossing Path (see page 3 of the NHTSA report [nhtsa.gov]), such as a car running a red light, is almost impossible for any autonomous driving software to detect, especially if the scene includes multiple lanes of stopped traffic, or trees, or similar sight-line issues. The crossing car is in the sight lines of the sensors for far too short a time period.

          No autopilot system on the road has been certified for this, even though BMW and Tesla and Volvo offer autopilot systems.

          Which makes it hard to believe that fully autonomous driverless cars are going to fare well in this situation either.

          People need only watch a few Russian dash cam videos on YouTube to see just how frequently this happens.
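
          As a back-of-the-envelope illustration (my numbers, not the NHTSA's): cross traffic at 45 mph covers roughly 20 meters per second, so even a generous gap in an obstructed sight line gives the sensors only a second or two:

          # Rough time-in-view estimate for a crossing car; all figures assumed.
          MPH_TO_MS = 0.44704

          def visible_seconds(gap_m, cross_speed_mph):
              """How long a crossing car stays inside an unobstructed gap."""
              return gap_m / (cross_speed_mph * MPH_TO_MS)

          for gap_m in (10, 20, 40):
              print(f"{gap_m} m gap at 45 mph: {visible_seconds(gap_m, 45):.2f} s in view")
          # Prints ~0.50 s, ~0.99 s, ~1.99 s - little time to detect, classify, and brake.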

          --
          No, you are mistaken. I've always had this sig.
          • (Score: 3, Insightful) by Nerdfest on Wednesday September 13 2017, @09:36PM

            by Nerdfest (80) on Wednesday September 13 2017, @09:36PM (#567484)

            People suck just as bad at that scenario though, maybe worse.

          • (Score: 3, Insightful) by BasilBrush on Wednesday September 13 2017, @10:09PM (5 children)

            by BasilBrush (3994) on Wednesday September 13 2017, @10:09PM (#567502)

            The more autonomous systems on the road, the less red lights will be jumped.

            --
            Hurrah! Quoting works now!
            • (Score: 3, Interesting) by frojack on Thursday September 14 2017, @12:45AM (4 children)

              by frojack (1554) on Thursday September 14 2017, @12:45AM (#567545) Journal

              Maybe. Maybe not. Why wouldn't the AI in the software know about the 2-second all-way red, or the extended yellow, and try to take advantage of it just like drivers do? Remember, to err is human. To really fuck things up you need a computer.

              I can see improvement in this area when all cars communicate with each other, indicating speed, location, and direction. Tinfoil sales are likely to hit boom times, but in-dash warnings of unsafe crossing conditions (even when the light is green) would provide safety benefits to human drivers as well as autonomous cars.
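
              A hypothetical beacon like this would carry everything needed; the field names are invented, though real V2V proposals (e.g. the DSRC Basic Safety Message) carry much the same data:

              import json, time
              from dataclasses import dataclass, asdict

              # Hypothetical vehicle-to-vehicle beacon; fields are illustrative.
              @dataclass
              class V2VBeacon:
                  vehicle_id: str
                  lat: float
                  lon: float
                  speed_ms: float     # meters per second
                  heading_deg: float  # 0 = north, clockwise

              # Broadcast ~10x per second; a receiver flags any beacon whose
              # projected path crosses its own (even when its light is green).
              beacon = V2VBeacon("example-id", 38.5816, -121.4944, 20.1, 90.0)
              print(json.dumps({**asdict(beacon), "ts": time.time()}))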

              --
              No, you are mistaken. I've always had this sig.
              • (Score: 2) by BasilBrush on Thursday September 14 2017, @01:44AM (1 child)

                by BasilBrush (3994) on Thursday September 14 2017, @01:44AM (#567570)

                Because the number one objective is safety.

                --
                Hurrah! Quoting works now!
                • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @08:16AM

                  by Anonymous Coward on Thursday September 14 2017, @08:16AM (#567694)

                  If the number one objective is safety, they wouldn't have built a broken system that requires humans to do the one thing that humans are especially bad at and computers are really good at: just sitting there, monitoring traffic, ready to take over at short notice.

              • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @04:02AM

                by Anonymous Coward on Thursday September 14 2017, @04:02AM (#567628)

                For starters, there is no AI.

              • (Score: 2) by DeathMonkey on Thursday September 14 2017, @07:46PM

                by DeathMonkey (1380) on Thursday September 14 2017, @07:46PM (#568039) Journal

                Why wouldn't the AI in the software know about the 2 second all way red, or the extended yellow, and try to take advantage of it just like drivers do.

                Because auto manufacturers aren't stupid. Well...THAT stupid anyway!

    • (Score: 2) by tfried on Thursday September 14 2017, @08:16AM

      by tfried (5534) on Thursday September 14 2017, @08:16AM (#567692)

      Depends on the type of blame you have in mind.

      From a perspective of guilt, yes, clearly the driver is at fault for misusing the system.

      From a perspective of road safety, you will realize that users will be users, and a system that can be misused will be misused. From that perspective in particular, it also makes sense to consider whether it is really a good idea to have a system that makes driving even more boring, while still relying on the driver to pay full attention.

      I reckon the short term answer (before there is a fully-autonomous system) will be to make the system even more obnoxious in watching the driver for signs of inattention. But that may also make the system much less popular...

    • (Score: 3, Informative) by theluggage on Thursday September 14 2017, @10:37AM

      by theluggage (1797) on Thursday September 14 2017, @10:37AM (#567724)

      No, anyone who blames the system for the driver who didn't use it correctly is a full time idiot.

      Blame and responsibility don't obey a conservation law: the car driver can be responsible and the truck driver can be responsible and Tesla can be responsible. Lawyers and insurance companies like there to be one victim and one villain - for their own profit and convenience - but when it comes to learning lessons for the future you really need to look at the big picture.

      I don't think anyone here is saying that the driver wasn't stupid to ignore the warnings - but would it have made any difference if his hands were on the wheel unless his mind was on the job?

      There's an inconvenient truth about self-driving: it won't be ready until it is ready. Any system in which the car drives itself, but expects Joe Public to stay alert and intervene when the computer screws up, is an accident waiting to happen. Tesla's problem is that they want to crowdsource the alpha testing of their system rather than relying on expensive test drivers.

  • (Score: 3, Interesting) by crafoo on Wednesday September 13 2017, @09:01PM (2 children)

    by crafoo (6639) on Wednesday September 13 2017, @09:01PM (#567463)

    Until they take away the steering, throttle, and brake controls the driver is responsible for the safe operation of the vehicle.

    This is a case of a marketing department being directly liable for misinformation that led to a death.

    • (Score: 4, Interesting) by bob_super on Wednesday September 13 2017, @09:29PM

      by bob_super (1357) on Wednesday September 13 2017, @09:29PM (#567479)

      Let's consider this highly informative title: "Tesla's Semiautonomous System Contributed to Fatal Crash"

      I think stepping into the car, and even getting out of bed that day, also "contributed to" the fatal crash. Using a vehicle contributed to the fatal crash. Having a glorified cruise control on can contribute to a crash.
      The person behind the wheel, who acted as if 2016 tech could be trusted to handle hazards on US public roads, definitely contributed to the crash, too. But to a significantly greater degree.

    • (Score: 2) by MostCynical on Wednesday September 13 2017, @10:17PM

      by MostCynical (2589) on Wednesday September 13 2017, @10:17PM (#567506) Journal

      And Coca-Cola is directly responsible for any buyer not having the lifestyle and activities portrayed in their advertisements.

      Hype has been permitted in advertising for as long as there has been advertising (before it even had a name...)

      Sure, limit the hype to 'reality', but don't blame the marketing and advertising dudes, unless you also want to restrict ALL sales reps to "the truth". (good luck with that)

      --
      "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
  • (Score: 3, Informative) by Virindi on Wednesday September 13 2017, @09:05PM (13 children)

    by Virindi (3484) on Wednesday September 13 2017, @09:05PM (#567467)

    As has been said before, they are sending mixed messages. Out of one side of their mouth, they load up the user with disclaimers and their PR team carefully crafts statements about how it is very important to always pay attention. But then out of the other side, they name the feature "Autopilot"* and then the marketing team produces a bunch of hype about how it is so advanced and it is going to be fully autonomous really soon now.

    It might be different if the company was speaking with one voice. But this way, they are just being slimy.

    *As has been said before by others, yes I know that airline pilots have to pay attention when autopilot is enabled. But this is about public perception of the term, not reality.

    • (Score: 3, Informative) by frojack on Wednesday September 13 2017, @09:34PM (1 child)

      by frojack (1554) on Wednesday September 13 2017, @09:34PM (#567482) Journal

      You overstate the case.

      You seem to gloss over the fact that the cars enforce keeping your hands on the wheel most of the time, and only allow short periods of hands off driving. (The length of time you can be hands off is adjusted according to road conditions. You could probably cross Nevada on US Route 50 and never touch the wheel)

      So it's not at all like people can watch the advertising and then drive to work with their hands full of coffee and cell phone for the entire trip.

      Since the May crash, Tesla has shortened the period you can be hands free, and will slow the vehicle to a crawl, and deny use of autopilot if the driver does not keep at least one hand on the wheel as the system demands.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 2) by Virindi on Wednesday September 13 2017, @09:45PM

        by Virindi (3484) on Wednesday September 13 2017, @09:45PM (#567487)

        Since the May crash, Tesla has shortened the period you can be hands free, and will slow the vehicle to a crawl, and deny use of autopilot if the driver does not keep at least one hand on the wheel as the system demands.

        I hadn't heard about that, that is positive.

        The marketing hype still seems to be full steam though.

    • (Score: 2) by JNCF on Wednesday September 13 2017, @09:40PM (10 children)

      by JNCF (4317) on Wednesday September 13 2017, @09:40PM (#567485) Journal

      *As has been said before by others, yes I know that airline pilots have to pay attention when autopilot is enabled. But this is about public perception of the term, not reality.

      If you can afford a Tesla, your phone has access to a plethora of dictionaries. There's no need to start redefining words.

      • (Score: 3, Informative) by Virindi on Wednesday September 13 2017, @09:48PM (7 children)

        by Virindi (3484) on Wednesday September 13 2017, @09:48PM (#567490)

        If you can afford a Tesla, your phone has access to a plethora of dictionaries. There's no need to start redefining words.

        What? How many people with a Tesla do you think actually go to a dictionary to verify the definition of "autopilot"? That's completely unrealistic. I am talking here about the impression the name gives to Joe Public, not the dictionary definition.

        • (Score: 2) by JNCF on Wednesday September 13 2017, @09:55PM

          by JNCF (4317) on Wednesday September 13 2017, @09:55PM (#567495) Journal

          This is why we can't have nice words.

        • (Score: 1, Disagree) by khallow on Wednesday September 13 2017, @11:43PM (5 children)

          by khallow (3766) Subscriber Badge on Wednesday September 13 2017, @11:43PM (#567531) Journal

          I am talking here about the impression the name gives to Joe Public, not the dictionary definition.

          Think of it as evolution in action. If a person gets killed or kills someone because they didn't bother to learn to drive a car for which they spent at least $68k, then fuck them. I get that everyone is not smart all the time, but we have to draw the line at high levels of negligence.

          • (Score: 4, Insightful) by Virindi on Wednesday September 13 2017, @11:47PM (2 children)

            by Virindi (3484) on Wednesday September 13 2017, @11:47PM (#567532)

            Unfortunately, I drive the same roads as these people. So my life is also at risk by their behavior.

            If they were doing it on their own farm road where there weren't any other drivers, I'd say have at it.

            • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @12:09AM (1 child)

              by Anonymous Coward on Thursday September 14 2017, @12:09AM (#567539)

              He understands. He wrote "or kills someone". If that someone is you, it's still evolution in action!

              • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @02:49AM

                by Anonymous Coward on Thursday September 14 2017, @02:49AM (#567600)

                Teela Brown ?

          • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @01:31AM (1 child)

            by Anonymous Coward on Thursday September 14 2017, @01:31AM (#567566)

            If a person gets killed or kills someone because they didn't bother to learn to drive a car for which they spent at least $68k, then fuck them.

            If they kill someone else (without killing themselves) due to their incompetence, and then you fuck them, that person might successfully produce offspring.

            It would be better not to fuck such a person.

            • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @04:44PM

              by Anonymous Coward on Thursday September 14 2017, @04:44PM (#567915)

              Are you from another planet? In our culture, the fantasized fucking of persons convicted of negligent homicide (among many other offenses) is universally male-on-male. I really don't think GP needs to worry about reproduction, although it might be good if they could stop fantasizing about prison rape.

      • (Score: 2) by theluggage on Thursday September 14 2017, @11:09AM (1 child)

        by theluggage (1797) on Thursday September 14 2017, @11:09AM (#567733)

        I wasn't aware that pilots were expected to sit with their hands on the stick and their eyes on the sky ahead while the plane was on autopilot, so they could slam on the brakes if a 747 suddenly pulled out from behind a cloud...

        But then, driving is not flying. Insofar as it pertains to avoiding crashing into other planes or pedestrians, flying is easier than driving (other aspects may be a bit harder). Airline pilots report a "near miss" if another plane passes within a quarter of a mile of them. Airline pilots flying through crowded airspace have support from air traffic control. Airline pilots are paid to fly planes - they're not rushing to get to work so they can start getting paid. Airline pilots on large planes typically have a co-pilot to watch their back.

        An airline-style autopilot (which basically kept you on a particular bearing) would be as much use as an inflatable dartboard in a car - so there's not much point arguing about what "autopilot" might mean in the context of a car.

        • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @04:48PM

          by Anonymous Coward on Thursday September 14 2017, @04:48PM (#567921)

          Sure, a literal airplane autopilot wouldn't do much good in a car, but there is a pretty clear analogy between "maintain constant heading" in the air and "stay in lane" on the road.

  • (Score: 3, Informative) by meustrus on Wednesday September 13 2017, @09:50PM (5 children)

    by meustrus (4961) on Wednesday September 13 2017, @09:50PM (#567491)

    It's pretty clear to me what the NTSB is implying here. Tesla loudly warning the driver multiple times to take the wheel isn't enough. The system "allowed the driver...to rely too heavily on the car's automation" [emphasis mine].

    Therefore, it is up to Tesla to actively prevent drivers from ignoring the warnings. They need to construct a perfect system that also happens to be completely closed off from driver input. And in order to protect that perfect system, it needs to be completely closed off from driver modification.

    The first freedom to go is always the freedom to (stupidly) kill oneself. What inevitably follows is the freedom to tell the nanny to go fuck itself so one can (stupidly) kill oneself anyway.

    --
    If there isn't at least one reference or primary source, it's not +1 Informative. Maybe the underused +1 Interesting?
    • (Score: 2) by takyon on Wednesday September 13 2017, @10:18PM (4 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Wednesday September 13 2017, @10:18PM (#567507) Journal

      If the car requires a hand on the wheel to stay at full speed in autopilot mode, as frojack says, why not include an electric shock along with warning lights? Make it strong enough to be noticeable but not so much that you abandon the wheel. Or just do it one time instead of twice a second or whatever.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 3, Funny) by JNCF on Wednesday September 13 2017, @11:37PM (3 children)

        by JNCF (4317) on Wednesday September 13 2017, @11:37PM (#567529) Journal

        Or after the third warning is ignored, Hastings the nefarious driver into the nearest tree. "BOSTON BRAKES INITIATING IN 3... 2...."

        • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @12:48AM (2 children)

          by Anonymous Coward on Thursday September 14 2017, @12:48AM (#567547)

          Wow, I've never heard this verb "Hastings".... I kinda get the context but what exactly does it mean? Where does this come from? Is this a reference to how the Hastings store chain ran out of money and had to shut down, or what?

          UrbanDictionary didn't explain "Hastings" but did have "Boston brakes": http://www.urbandictionary.com/define.php?term=Boston+Brakes [urbandictionary.com]

          • (Score: 4, Informative) by JNCF on Thursday September 14 2017, @01:05AM (1 child)

            by JNCF (4317) on Thursday September 14 2017, @01:05AM (#567555) Journal

            Michael Hastings was a journalist who died in a high-speed car wreck with a tree. [wikipedia.org] He asked to borrow a car belonging to his neighbor just days prior, explaining that he feared his vehicle had been tampered with. He had also told some friends that he was doing a big story on the government, and was going to "go off the radar." After his death the FBI initially lied about having a file on him, but has since released what they claim to be his file. Richard Clarke said his death was "consistent with a car cyber attack." There is no smoking gun, unless you interpret this grainy video [youtube.com] as showing an explosion prior to collision (I don't, I think they probably murdered him with a car hack sans explosives, but maybe the coroner is right and he was just smoking too many marijuanas).

            • (Score: 0) by Anonymous Coward on Thursday September 14 2017, @01:23AM

              by Anonymous Coward on Thursday September 14 2017, @01:23AM (#567561)

              Wow, I hadn't heard about that. Thanks for answering my question.
